Reddit + X to Google Sheets, trends ranked for you
You spot a topic that’s heating up, then you lose an hour bouncing between Reddit tabs, X searches, and half-finished notes. The links get messy. The “insights” turn into screenshots. And by the time you publish, the moment has moved on.
Trend brief automation hits marketers first because speed is the job. But content creators and agency leads feel it too, especially when you need a repeatable way to turn chatter into a weekly plan.
This n8n workflow takes one topic, pulls relevant posts from Reddit and X, scores what’s worth your attention, then writes a ranked trend brief straight into Google Sheets. You’ll see what it does, what you need, and what results to expect.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Reddit + X to Google Sheets, trends ranked for you
flowchart LR
subgraph sg0["Form Intake Flow"]
direction LR
n0["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/form.svg' width='40' height='40' /></div><br/>Form Intake Trigger"]
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>External API Fetch"]
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/x.dark.svg' width='40' height='40' /></div><br/>Search Social Posts"]
n3["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Parse Reddit Results"]
n4@{ icon: "mdi:cog", form: "rounded", label: "Collect Reddit Items", pos: "b", h: 48 }
n5@{ icon: "mdi:swap-vertical", form: "rounded", label: "Map Reddit Fields", pos: "b", h: 48 }
n6@{ icon: "mdi:swap-vertical", form: "rounded", label: "Map Social Fields", pos: "b", h: 48 }
n7@{ icon: "mdi:cog", form: "rounded", label: "Collect Social Items", pos: "b", h: 48 }
n8["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Combine Sources"]
n9@{ icon: "mdi:robot", form: "rounded", label: "Keyword Expansion Agent", pos: "b", h: 48 }
n10@{ icon: "mdi:robot", form: "rounded", label: "Keyword Output Parser", pos: "b", h: 48 }
n11@{ icon: "mdi:brain", form: "rounded", label: "Gemini Parser Model", pos: "b", h: 48 }
n12@{ icon: "mdi:brain", form: "rounded", label: "Gemini Keyword Model", pos: "b", h: 48 }
n13@{ icon: "mdi:swap-vertical", form: "rounded", label: "Iterate Keywords", pos: "b", h: 48 }
n14["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Extract Keywords"]
n15@{ icon: "mdi:swap-vertical", form: "rounded", label: "Unnest Keyword Data", pos: "b", h: 48 }
n16@{ icon: "mdi:robot", form: "rounded", label: "Content Insight Agent", pos: "b", h: 48 }
n17@{ icon: "mdi:swap-vertical", form: "rounded", label: "Iterate Content Items", pos: "b", h: 48 }
n18@{ icon: "mdi:brain", form: "rounded", label: "Gemini Insight Model", pos: "b", h: 48 }
n19@{ icon: "mdi:robot", form: "rounded", label: "Insight Output Parser", pos: "b", h: 48 }
n20@{ icon: "mdi:brain", form: "rounded", label: "Gemini Parser Model 2", pos: "b", h: 48 }
n21@{ icon: "mdi:robot", form: "rounded", label: "Trend Synthesis Agent", pos: "b", h: 48 }
n22@{ icon: "mdi:brain", form: "rounded", label: "Gemini Strategy Model", pos: "b", h: 48 }
n23["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Normalize Agent Output"]
n24@{ icon: "mdi:cog", form: "rounded", label: "Placeholder Extra Source", pos: "b", h: 48 }
n25@{ icon: "mdi:cog", form: "rounded", label: "Placeholder Source", pos: "b", h: 48 }
n3 --> n5
n14 --> n13
n8 --> n13
n16 --> n17
n9 --> n14
n21 --> n23
n4 --> n8
n15 --> n17
n7 --> n8
n5 --> n4
n6 --> n7
n1 --> n3
n2 --> n6
n13 --> n15
n13 --> n1
n13 --> n2
n13 --> n24
n13 --> n25
n17 --> n21
n17 --> n16
n0 --> n9
n18 -.-> n16
n10 -.-> n9
n11 -.-> n10
n12 -.-> n9
n20 -.-> n19
n22 -.-> n21
n19 -.-> n16
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n9,n10,n16,n19,n21 ai
class n11,n12,n18,n20,n22 aiModel
class n1 api
class n3,n14,n23 code
classDef customIcon fill:none,stroke:none
class n0,n1,n2,n3,n8,n14,n23 customIcon
The Problem: Trend Research Is Scattered and Hard to Trust
Manual trend research looks simple until you do it consistently. You start with “Let’s research one topic,” then you’re juggling Reddit searches, X queries, engagement numbers, and a bunch of posts that are kind of relevant but not usable. You paste links into a doc, forget where they came from, and end up rewriting the same summary every week. The worst part is the decision fatigue: you’re staring at 40 posts, trying to guess what will matter to your audience, with no structure and no scoring.
It adds up fast. Here’s where it breaks down when you try to scale this beyond “I’ll just do it myself.”
- Searching Reddit and X separately turns one topic into about 30 minutes of tab hopping.
- Engagement signals are easy to misread when you’re comparing posts across platforms by eye.
- Notes don’t become a backlog, so the same research gets repeated next week.
- Without grouping and ranking, you end up with “interesting links” instead of publishable angles.
The Solution: One Topic In, Ranked Trend Brief Out
This workflow starts with a simple form where you enter a single topic (like “email deliverability” or “AI note-taking”). An AI agent expands that topic into subtopics and keywords, then n8n automatically searches Reddit and X for posts that match. For each post, it captures the parts you actually need later: title/text, engagement metrics, and the original link. Next, an AI model analyzes every item and labels it with trend potential, audience relevance, platform fit, recommended formats, categories, and keywords. Finally, a synthesis agent groups similar findings, ranks the strongest opportunities, and outputs a clean brief into Google Sheets so your team has one place to work from.
The flow is straightforward. A form submission triggers keyword expansion, then the workflow collects content from Reddit and X in parallel. After AI scoring and grouping, you get a ranked, structured trend brief in Sheets, ready for content planning or client reporting.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Expands one topic into AI-generated subtopics and search keywords | A repeatable research process anyone on the team can trigger from a form |
| Searches Reddit and X in parallel and normalizes every post | One merged dataset with titles, text, engagement metrics, and source links |
| Scores each item for trend potential, audience relevance, and platform fit | A ranked list of opportunities instead of a pile of “interesting links” |
| Groups similar findings and writes the brief to Google Sheets | A weekly trend brief that takes minutes to review, not hours to build |
Example: What This Looks Like
Say you research 3 topics per week for your content calendar. Manually, a “quick” pass is often 30 minutes on Reddit plus 30 minutes on X, then another 30 minutes to summarize and organize, which is roughly 4 to 5 hours a week. With this workflow, entering each topic takes about 2 minutes, and the automated collection plus AI analysis runs in the background while you do other work. You still review the final sheet, but it’s a focused skim, not a scavenger hunt.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets for storing the ranked trend brief
- X (Twitter) API access to search and pull posts
- Google Gemini (PaLM) API key (get it from Google AI Studio / Google Cloud)
Skill level: Intermediate. You’ll connect a few credentials, make sure the form trigger is accessible, and tweak prompts if you want different outputs.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A form submission kicks it off. You enter a single topic in the n8n Form Trigger, which keeps the input simple enough for anyone on your team to use.
AI expands your topic into keywords. The keyword expansion agent generates subtopics and search terms, then parses them into a structured list so n8n can iterate cleanly without you copying anything.
Reddit and X content gets collected and normalized. For each keyword, the workflow fetches Reddit results via HTTP requests and searches X via OAuth. It maps fields into a consistent format (text, title, link, engagement), then merges both sources into one stream.
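The Reddit half of that normalization can be sketched in a few lines. This is not the template’s exact Code node, just an illustration of the mapping, assuming Reddit’s public `search.json` response shape (`data.children[].data`):

```javascript
// Sketch of the "Parse Reddit Results" mapping (illustrative, not the
// template's exact code). Reddit's search.json nests each post under
// data.children[].data; we keep only the fields the brief needs.
function parseRedditResults(searchJson) {
  return searchJson.data.children.map(({ data }) => ({
    source: 'reddit',
    title: data.title,
    text: data.selftext || '',
    link: `https://www.reddit.com${data.permalink}`,
    engagement: { score: data.score, comments: data.num_comments },
  }));
}
```

X results get the same treatment in Map Social Fields, so both streams share one schema before Combine Sources merges them.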
Every item is scored, then summarized into a trend brief. The content insight agent analyzes posts one by one, and the trend synthesis agent groups similar insights, ranks opportunities, and produces strategic content recommendations that are easy to paste into a plan.
You can easily modify the sources (for example, adding YouTube or Hacker News) to match where your audience actually spends time. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Form Trigger
Set up the intake form that starts the workflow and captures the research topic.
- Add and open Form Intake Trigger.
- Set Form Title to `topic`.
- Set Form Description to `Auto-Research Assistant for Market Trends`.
- Confirm the form field label is `topic` with placeholder `one word is enough`.
Step 2: Set Up Keyword Expansion and Parsing
This step expands the user topic into subtopics and structured keyword output using the AI stack.
- Open Keyword Expansion Agent and keep the Text prompt as provided, including the `{{ $json.topic }}` expression references.
- Confirm Keyword Expansion Agent has Has Output Parser enabled.
- Open Keyword Output Parser and set Auto Fix to `true`.
- Set JSON Schema Example in Keyword Output Parser to the provided schema block.
- Credential Required: Connect your `googlePalmApi` credentials to Gemini Keyword Model.
- Credential Required: Connect your `googlePalmApi` credentials to Gemini Parser Model.
- Ensure Gemini Keyword Model is connected as the language model for Keyword Expansion Agent.
- Ensure Gemini Parser Model is connected as the language model for Keyword Output Parser (credentials are added to Gemini Parser Model, not the parser).
Step 3: Fetch Reddit and Social Content for Each Keyword
This step iterates keywords, pulls Reddit and X content, and normalizes fields for aggregation.
- Open Extract Keywords and keep the JavaScript that flattens `$json.output.subtopics` into individual keyword items.
- Use Iterate Keywords to batch keywords; connect its first output to Unnest Keyword Data.
- In Unnest Keyword Data, set Field to Split Out to `data[0].keyword` and Include to `allOtherFields`.
- From the second output of Iterate Keywords, connect to both External API Fetch and Search Social Posts in parallel, plus the placeholders.
- In External API Fetch, set URL to `=https://www.reddit.com/search.json?q={{ $json.keyword }}` and add the query parameter `limit=5`.
- In Search Social Posts, set Operation to `search`, Limit to `10`, Search Text to `={{ $json.topic }}`, and Sort Order to `recency`.
- Credential Required: Connect your `twitterOAuth2Api` credentials to Search Social Posts.
- Keep the Parse Reddit Results JavaScript as provided to map Reddit fields into clean JSON.
- In Map Reddit Fields, set source to `reddit` and keyword to `={{ $('Extract Keywords').item.json.keyword }}`, with Include Other Fields enabled.
- In Map Social Fields, set source to `x (formerly twitter)`, text to `={{ $json.text }}`, and keyword to `={{ $('Extract Keywords').item.json.keyword }}`.
- Aggregate each stream using Collect Reddit Items and Collect Social Items, then merge in Combine Sources.
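For orientation, the flattening that Extract Keywords performs can be sketched like this. It is a simplified stand-in, assuming the parsed agent output carries a `subtopics` array with per-subtopic `keywords` (n8n Code nodes return items wrapped as `{ json: … }`):

```javascript
// Simplified stand-in for the Extract Keywords Code node: turn one item
// holding output.subtopics into one item per keyword, so Iterate Keywords
// can batch them. Field names are assumptions for illustration.
function extractKeywords(items) {
  return items.flatMap((item) =>
    (item.json.output?.subtopics ?? []).flatMap((sub) =>
      (sub.keywords ?? []).map((keyword) => ({
        json: { subtopic: sub.name, keyword },
      }))
    )
  );
}
```

One item per keyword is what lets the Reddit and X fetch nodes run once per search term instead of once per topic.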
Note: `={{ $json.topic }}` is populated from the form.
Step 4: Analyze Content Items with AI Insights
This step scores each content item and produces structured insights for trend synthesis.
- Route Combine Sources into Iterate Keywords and then into Unnest Keyword Data as shown in the workflow.
- From Unnest Keyword Data, feed Iterate Content Items to batch content items.
- Open Content Insight Agent and keep the full prompt, including the `{{ $json.data[0].title }}` expression and the keyword reference `{{ $('Extract Keywords').item.json.keyword }}`.
- Ensure Insight Output Parser is connected to Content Insight Agent for structured output; credentials are added to the model node, not the parser.
- Credential Required: Connect your `googlePalmApi` credentials to Gemini Insight Model.
- Credential Required: Connect your `googlePalmApi` credentials to Gemini Parser Model 2.
- Ensure Gemini Insight Model is connected as the language model for Content Insight Agent.
- Ensure Gemini Parser Model 2 is connected as the language model for Insight Output Parser.
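To make the step concrete, here is a hypothetical example of the per-item labels described earlier (trend potential, audience relevance, platform fit, recommended formats, categories, keywords). The exact field names and scales come from the template’s parser schema, not this sketch:

```javascript
// Hypothetical insight object for one post — names and scales here are
// illustrative assumptions; the template's Insight Output Parser schema
// is authoritative.
const exampleInsight = {
  trend_potential: 'high',
  audience_relevance: 8,
  platform_fit: ['x', 'reddit'],
  recommended_formats: ['thread', 'short-form video'],
  categories: ['deliverability'],
  keywords: ['spf record'],
};
```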
Step 5: Synthesize Trends and Normalize Output
This step consolidates insights into final trend recommendations and cleans the AI output.
- From Iterate Content Items, send items to Trend Synthesis Agent.
- Keep the Trend Synthesis Agent prompt intact, including `{{ JSON.stringify($json) }}` for batch processing.
- Credential Required: Connect your `googlePalmApi` credentials to Gemini Strategy Model.
- Ensure Gemini Strategy Model is connected as the language model for Trend Synthesis Agent.
- In Normalize Agent Output, keep the JavaScript that removes code fences and parses JSON output.
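The cleanup step is worth understanding because LLMs often wrap JSON in Markdown code fences. A minimal sketch of that defensive parsing (illustrative, not the template’s exact code):

```javascript
// Strip an optional leading/trailing code fence (three backticks,
// optionally tagged "json") before parsing — a common failure mode when
// an agent returns fenced output instead of raw JSON.
function normalizeAgentOutput(raw) {
  const cleaned = raw
    .replace(/^\s*`{3}(?:json)?\s*/i, '')
    .replace(/\s*`{3}\s*$/, '')
    .trim();
  return JSON.parse(cleaned);
}
```

If the fences survive, `JSON.parse` throws and the final write to Sheets fails, so this guard belongs right before the output step.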
Step 6: Test & Activate Your Workflow
Run a full test to confirm data collection, AI analysis, and final trend output are working end-to-end.
- Click Execute Workflow and submit a sample topic using Form Intake Trigger.
- Verify External API Fetch and Search Social Posts return data, and that Combine Sources merges both streams.
- Check that Content Insight Agent outputs structured insight objects parsed by Insight Output Parser.
- Confirm Normalize Agent Output outputs clean JSON with `final_trends` and `strategy_recommendations`.
- When successful, toggle the workflow to Active for production use.
Common Gotchas
- X (Twitter) OAuth2 credentials can expire or lack the right scopes. If things break, check your n8n Credentials panel and the X developer app permissions first.
- If you’re using Wait nodes or external processing, run times vary. Bump up the wait duration if downstream AI nodes fail because they received an empty or partial batch.
- Gemini prompts start generic, honestly. Add brand voice, audience, and “what good looks like” near the Keyword Expansion Agent and Content Insight Agent or you will be editing the brief every time.
Frequently Asked Questions
**How long does setup take?**
About 30–60 minutes if your Gemini, X, and Google accounts are ready.

**Do I need to know how to code?**
No. You’ll mostly connect accounts and paste in API credentials. The only “techy” part is checking that the form trigger URL is accessible.

**Is this free to run?**
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Gemini API costs, which depend on usage and model selection.

**Where should I host it?**
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

**Can I send the brief to Notion instead of Google Sheets?**
Yes, and it’s a common tweak. You would replace the Google Sheets output step with a Notion database create/update action, keeping the same normalized fields produced after the Trend Synthesis Agent. Many teams also add a “content status” field (Drafting, Editing, Scheduled) so the brief becomes a working pipeline. If you want both, you can write to Sheets for reporting and Notion for execution.

**What if the X search stops working?**
Usually it’s expired OAuth credentials or missing scopes on the X developer app. Reconnect the X credential inside n8n, then re-run a single keyword to test. If you’re pulling lots of posts in one run, it can also be rate limiting, so reducing keywords per run often stabilizes it.

**How many topics can I research per week?**
A lot, but it depends on your n8n plan, your server, and your API limits.

**How does this compare to Zapier or Make?**
For this use case, n8n tends to win when you need batching, merging two sources, and multi-step AI analysis in one run. Zapier and Make can do parts of it, but the logic gets fiddly once you add “expand keywords, loop, score each item, then synthesize.” n8n also gives you a self-host option, which matters if you’re running large research batches. That said, if you only want a simple “new Reddit post → append to sheet,” other tools can be quicker to start. Talk to an automation expert if you’re not sure which fits.
Set this up once and your research stops living in browser tabs. The workflow handles the messy collection and ranking so you can focus on what to publish next.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.