January 22, 2026

Reddit to Google Sheets, clear sentiment at a glance

Lisa Granqvist · Workflow Automation Expert

You open a Reddit thread expecting “a quick scan,” and suddenly you’re 45 minutes deep, still unsure if people loved the update or hated it. The worst part is the uncertainty. You read a handful of comments and still don’t trust your takeaway.

This pain hits product marketers hardest, but founders and community managers feel it too. Instead of manually skimming, this Reddit sentiment logging automation gives you a clean sheet that shows each comment, its sentiment (positive/neutral/negative), and the AI’s reasoning.

You’ll see how the workflow pulls Reddit comments via Bright Data, analyzes them with Google Gemini, then writes the results into Google Sheets so you can spot trends at a glance.

How This Automation Works

Here’s the complete workflow you’ll be setting up:

n8n Workflow Template: Reddit to Google Sheets, clear sentiment at a glance

Why This Matters: Turning comment chaos into a clear signal

Reddit is brutally honest, which is exactly why it’s useful and exhausting. If you’re tracking brand mentions, a launch thread, or a competitor discussion, the comments move fast and the tone shifts constantly. You read ten comments and think you’ve got it, then the next ten flip the mood entirely. Meanwhile, you still have to brief your team, update stakeholders, or decide what to fix next. Manual scanning doesn’t just take time. It creates “confidence theater,” where you feel informed but you can’t really prove it.

It adds up fast. Here’s where it breaks down in real life.

  • You end up sampling a tiny slice of comments, and your conclusions skew toward whatever you happened to read first.
  • Copying quotes into a doc takes forever, and you still don’t have a consistent sentiment label to compare week over week.
  • It’s easy to miss early warning signs because the “negative” feedback is buried inside long, mixed replies.
  • Sharing insights is messy, because screenshots and highlights don’t translate into something your team can sort, filter, or chart.

What You’ll Build: Reddit comments → Gemini sentiment → Sheets log

This workflow turns any single Reddit post URL into a structured sentiment log in Google Sheets. You start by pasting a Reddit link into n8n and running the workflow. Bright Data handles the heavy lifting by scraping the post’s comments in a way that avoids common rate limits and access headaches. After a short wait while the snapshot is prepared, the workflow downloads the comments, selects a small batch (by default, five), and sends them to a Google Gemini-powered agent. Gemini classifies each comment as positive, negative, or neutral and also explains why it chose that label. Finally, n8n appends the comment text, the sentiment, and the reasoning into a Google Sheet you can filter, share, and reuse for reporting.

The workflow begins with a Reddit URL you control. Then it collects comments, runs them through AI sentiment analysis, and writes the results into Sheets as a clean dataset. After that, you can build charts, track changes over time, or just stop rereading the same threads.


Expected Results

Say you review 3 Reddit threads a week for brand monitoring, and you typically read about 40 comments per thread at roughly 1 minute each. That’s about 2 hours weekly, and it still leaves you with scattered notes. With this workflow, you paste the URL, wait a few minutes for the snapshot, then review the sheet output (often 5–10 minutes per thread for a quick pulse). Even if you later raise the comment limit, you’re spending time on decisions, not scanning.
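The back-of-envelope math above is easy to check. A tiny sketch (using the article's example numbers, not measurements):

```python
# Rough weekly time comparison from the paragraph above.
THREADS_PER_WEEK = 3
COMMENTS_PER_THREAD = 40
MINUTES_PER_COMMENT = 1

manual_minutes = THREADS_PER_WEEK * COMMENTS_PER_THREAD * MINUTES_PER_COMMENT  # 120

# With the workflow: roughly 5-10 minutes of sheet review per thread;
# use the upper bound to stay conservative.
automated_minutes = THREADS_PER_WEEK * 10  # 30

saved_hours = (manual_minutes - automated_minutes) / 60
print(f"manual: {manual_minutes} min/week, automated: {automated_minutes} min/week, "
      f"saved: {saved_hours:.1f} h/week")
```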

Before You Start

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Bright Data for scraping Reddit comments reliably
  • Google Gemini to score sentiment and provide reasoning
  • Google Sheets access (connect your Google account in n8n)

Skill level: Beginner. You’ll mostly connect accounts, paste a URL, and verify the sheet mapping.

Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).

Step by Step

You provide a Reddit post link. The workflow starts with a manual run in n8n, where a “set fields” step assigns the URL you want to analyze. This keeps it simple when you’re researching one thread at a time.

Comments get collected via Bright Data. Bright Data creates a scrape snapshot for the post, n8n waits briefly, then downloads the results once they’re ready. That wait matters because scrapes aren’t always instant.

Gemini evaluates sentiment. A small batch of comments (limited to five by default) goes into an AI Agent powered by Google Gemini. The agent returns structured sentiment plus reasoning, and an auto-fixing parser helps keep the output consistent when AI responses get slightly messy.

Your Google Sheet becomes the dashboard. Each analyzed comment is appended as a new row, so you build a growing dataset over time. Filters, pivot tables, and charts become effortless because the data is already normalized.

You can easily modify the comment limit to analyze more (or fewer) comments based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

Start the workflow manually to run the Reddit sentiment pipeline on demand.

  1. Add the Manual Execution Start node as your trigger.
  2. Ensure Manual Execution Start is connected to Assign Reddit Link to match the execution flow.

Step 2: Connect Bright Data for Reddit Comment Retrieval

Configure Bright Data to scrape Reddit comments, wait for the snapshot, and download the results.

  1. In Assign Reddit Link, set the post URL field to https://www.reddit.com/r/iphone/comments/1kl0tb5/new_ios_185_update/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button.
  2. Open Bright Data Retrieve Comments and set URLs to =[{"url":"{{ $json["post URL"] }}"}].
  3. Set Resource to webScrapper and select your Bright Data dataset ID in dataset_id.
  4. Credential Required: Connect your brightdataApi credentials in Bright Data Retrieve Comments.
  5. In Pause for Snapshot Ready, keep Unit set to minutes to allow the snapshot to process.
  6. In Bright Data Download Snapshot, set snapshot_id to ={{ $json.snapshot_id }} and keep Resource as webScrapper.
  7. Credential Required: Connect your brightdataApi credentials in Bright Data Download Snapshot.

⚠️ Common Pitfall: Make sure the dataset ID in Bright Data Retrieve Comments is valid, or the snapshot will fail to generate.

Step 3: Limit and Prepare Comments for Analysis

Reduce the number of comments to analyze and prepare them for the AI sentiment evaluator.

  1. In Limit to Five Comments, set maxItems to 5.
  2. Confirm the execution flow is Bright Data Download Snapshot → Limit to Five Comments → AI Sentiment Evaluator.
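The limit step is simple but worth understanding, since it is the main knob for cost and speed. In plain Python, `maxItems = 5` amounts to slicing the downloaded item list (the sample comments here are made up):

```python
# Stand-in for the snapshot's downloaded comment items.
comments = [{"comment": f"comment {i}", "num_upvotes": i} for i in range(12)]

MAX_ITEMS = 5            # raise this to analyze more comments per run
batch = comments[:MAX_ITEMS]

print(len(batch))  # 5
```

Each item in `batch` then goes to the AI Sentiment Evaluator individually, so raising the limit raises your Gemini usage linearly.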

Step 4: Set Up AI Sentiment Analysis

Use Gemini to evaluate sentiment and parse structured results with auto-correction.

  1. In AI Sentiment Evaluator, set Text to =Based on the Reddit post's comment below, decide whether the sentiment is positive, negative or neutral. comment: {{ $json.comment }} Number of upvotes: {{ $json.num_upvotes }}.
  2. Ensure Prompt Type is define and hasOutputParser is enabled.
  3. Connect Gemini Chat Engine as the language model for AI Sentiment Evaluator.
  4. Credential Required: Connect your googlePalmApi credentials in Gemini Chat Engine.
  5. Set up Structured Result Parser with the jsonSchemaExample value shown in the node to enforce structured output.
  6. Connect Structured Result Parser → Auto-Correct Parser → AI Sentiment Evaluator as the output parser chain.
  7. Connect Secondary Gemini Chat as the language model for Auto-Correct Parser.
  8. Credential Required: Connect your googlePalmApi credentials in Secondary Gemini Chat.

Auto-Correct and Structured Result parsers are AI sub-nodes. Add credentials to Gemini Chat Engine and Secondary Gemini Chat, not to the parser nodes themselves.
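Conceptually, the parser chain is a strict JSON check with a repair-and-retry fallback. This sketch is illustrative only: the field names follow the sheet mapping in Step 5, and the "auto-correct" shown here (stripping markdown code fences) is one common repair, not the node's exact behavior.

```python
import json

REQUIRED = {"comment", "sentiment", "reason"}
LABELS = {"positive", "negative", "neutral"}

def parse_strict(raw: str) -> dict:
    # What the Structured Result Parser enforces: valid JSON, required
    # keys present, sentiment restricted to the three allowed labels.
    data = json.loads(raw)
    if not REQUIRED <= data.keys() or data["sentiment"] not in LABELS:
        raise ValueError(f"bad structure: {data}")
    return data

def parse_with_autocorrect(raw: str) -> dict:
    # The Auto-Correct Parser's role: when strict parsing fails, repair
    # common LLM formatting slips and try once more.
    try:
        return parse_strict(raw)
    except (ValueError, json.JSONDecodeError):
        cleaned = raw.strip().removeprefix("```json").removesuffix("```").strip()
        return parse_strict(cleaned)

messy = ('```json\n{"comment": "Battery is worse", "sentiment": "negative", '
         '"reason": "Complains about battery drain"}\n```')
print(parse_with_autocorrect(messy)["sentiment"])  # negative
```

This is why the auto-correct pass needs its own language model connection in n8n: the real repair step asks a second Gemini instance to rewrite malformed output, rather than using fixed string cleanup.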

Step 5: Configure Google Sheets Output

Append the structured sentiment results to your Google Sheet.

  1. In Append Sentiment to Sheets, set Operation to append.
  2. Select your spreadsheet in documentId and choose sheetName as Sheet1 (gid 0).
  3. Map columns using the defined values: Comment={{ $json.output.comment }}, Sentiment={{ $json.output.sentiment }}, Reason={{ $json.output.reason }}.
  4. Credential Required: Connect your googleSheetsOAuth2Api credentials in Append Sentiment to Sheets.
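The column mapping above is declarative in n8n, but as plain code it is just pulling three fields out of the agent's `output` object and ordering them as a row (the sample values here are invented):

```python
def to_sheet_row(item: dict) -> list[str]:
    # Mirrors the mapping: Comment / Sentiment / Reason from $json.output.
    out = item["output"]
    return [out["comment"], out["sentiment"], out["reason"]]

item = {"output": {"comment": "Love the new widgets",
                   "sentiment": "positive",
                   "reason": "Expresses enthusiasm for a feature"}}
print(to_sheet_row(item))
```

If you add extra columns later (product area, thread topic), they extend this same row in the node's mapping.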

Step 6: Test and Activate Your Workflow

Validate the end-to-end pipeline and then activate it for ongoing use.

  1. Click Execute Workflow from Manual Execution Start to run the pipeline.
  2. Confirm that Bright Data Retrieve Comments produces a snapshot ID and that Bright Data Download Snapshot returns comment data.
  3. Verify Append Sentiment to Sheets adds new rows with Comment, Sentiment, and Reason values.
  4. Once successful, toggle the workflow to Active for production use.

Troubleshooting Tips

  • Google Sheets credentials can expire or use the wrong Google account. If things break, check n8n’s Credentials section and confirm the connected account can edit the target spreadsheet.
  • If you’re using Wait nodes or external scraping, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses or the snapshot download returns “not ready.”
  • Gemini prompts that are too generic lead to “meh” labels. Add a little context (what the product is, what “positive” means to you) or you will be second-guessing the output.
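One way to act on that last tip: prepend a short context preamble to the Step 4 prompt. The base wording below mirrors the workflow's prompt; the context sentences and the `product` / `positive_means` parameters are illustrative additions, not part of the original workflow.

```python
def build_prompt(comment: str, upvotes: int, product: str, positive_means: str) -> str:
    # Context preamble (added) + the workflow's original instruction.
    return (
        f"Context: the product is {product}. "
        f"Treat a comment as positive when {positive_means}.\n"
        "Based on the Reddit post's comment below, decide whether the "
        "sentiment is positive, negative or neutral.\n"
        f"comment: {comment}\n"
        f"Number of upvotes: {upvotes}"
    )

p = build_prompt("Update fixed my battery drain", 42,
                 "iOS 18.5", "it praises a fix or feature")
print(p.splitlines()[0])
```

In n8n you would bake this preamble directly into the Text field of AI Sentiment Evaluator rather than building it in code.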

Quick Answers

What’s the setup time for this Reddit sentiment logging automation?

About 30 minutes if your accounts are ready.

Is coding required for this sentiment logging?

No. You’ll connect Bright Data, Gemini, and Google Sheets, then paste a Reddit URL and run it.

Is n8n free to use for this Reddit sentiment logging workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Bright Data scraping costs and Gemini API usage (usually small for short comments).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I modify this Reddit sentiment logging workflow for different use cases?

Yes, and you probably should. You can increase or remove the “Limit to Five Comments” step to analyze more comments, and you can adjust the AI Agent instructions to match your definition of “positive” (for example, praise for pricing vs. praise for features). Some teams also add extra columns in the Google Sheets step, like product area, competitor name, or thread topic. If you want a hands-off daily run, replace the manual trigger with a webhook or a scheduled trigger and feed it URLs from a list.

Why is my Google Sheets connection failing in this workflow?

Usually it’s the wrong Google account or expired credentials. Reconnect Google Sheets in n8n, then confirm the spreadsheet is shared with that account and you selected the correct file and tab. Also check if the sheet has protected ranges, because appending rows can fail silently when permissions are restricted.

What volume can this Reddit sentiment logging workflow process?

It depends on how many comments you choose to analyze and how fast the scrape snapshot returns, but the default setup processes a small batch quickly.

Is this Reddit sentiment logging automation better than using Zapier or Make?

For this use case, n8n is usually the better fit because scraping + waiting + structured AI parsing is a lot of moving parts. You get more control over the “wait for snapshot” behavior, you can add branching logic without paying per step, and you can self-host if you need higher volume. Zapier and Make can work for simpler “fetch data → write row” tasks, but they get awkward when the data source is inconsistent or asynchronous. Frankly, the AI output parsing is another reason n8n shines here, since you can enforce structure before writing into Sheets. If you’re torn, Talk to an automation expert and we’ll help you pick the simplest option that won’t break next month.

Once this is in place, Reddit stops being a time sink and starts acting like a research feed you can actually use. The workflow does the sorting and labeling so you can focus on what to change next.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
