Reddit to Telegram, curated digests with Gemini
You open Reddit to “quickly research” a topic, then surface an hour later with 19 tabs, half-read threads, and no clean takeaway you can actually use.
This Reddit Telegram digest automation hits marketers first, honestly, but content creators and product folks doing customer research feel it too. You get a curated, readable brief in Telegram in about 20 seconds instead of scrolling and second-guessing what matters.
Below you’ll see how the workflow searches Reddit four ways, removes repeats, filters for quality, and has Gemini write a clean digest you can skim on your phone.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Reddit to Telegram, curated digests with Gemini
flowchart LR
subgraph sg0["Telegram Flow"]
direction LR
n0["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/telegram.svg' width='40' height='40' /></div><br/>Send a text message"]
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/telegram.svg' width='40' height='40' /></div><br/>Telegram Trigger"]
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search: Top Posts"]
n3["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search: Hot Posts"]
n4["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search: Relevance"]
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search: New Posts"]
n6["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge All Searches"]
n7["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Remove Duplicates"]
n8@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Data", pos: "b", h: 48 }
n9@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Filter Quality Posts", pos: "b", h: 48 }
n10@{ icon: "mdi:cog", form: "rounded", label: "Combine for AI", pos: "b", h: 48 }
n11@{ icon: "mdi:robot", form: "rounded", label: "Generate Summary", pos: "b", h: 48 }
n12@{ icon: "mdi:brain", form: "rounded", label: "Google Gemini Chat Model", pos: "b", h: 48 }
n13@{ icon: "mdi:memory", form: "rounded", label: "Simple Memory", pos: "b", h: 48 }
n5 --> n6
n10 --> n11
n13 -.-> n11
n11 --> n0
n1 --> n2
n1 --> n3
n1 --> n4
n1 --> n5
n8 --> n9
n7 --> n8
n3 --> n6
n4 --> n6
n2 --> n6
n6 --> n7
n9 --> n10
n12 -.-> n11
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n1 trigger
class n11 ai
class n12 aiModel
class n13 ai
class n9 decision
class n7 code
classDef customIcon fill:none,stroke:none
class n0,n1,n2,n3,n4,n5,n6,n7 customIcon
The Problem: Reddit Research Turns Into Noise
Reddit is incredible for raw, honest language. It’s also a time trap. You search a keyword, sort by “top,” then “hot,” then “new,” then you realize the best post is buried under three duplicates and two link-only threads. You copy a few quotes into a doc, forget the URLs, and later you can’t find the thread again when you need it for a brief or an ad angle. Meanwhile the clock keeps running, and your “research” time crowds out the work that actually ships.
The friction compounds. And it shows up in a few predictable places.
- Searching multiple sorts means you see the same posts again and again, so your attention gets wasted before you even start taking notes.
- Manual filtering is inconsistent, which means you save threads that look interesting but have no substance when you re-open them later.
- Good insights get trapped in scattered screenshots and half-finished docs, so you don’t build a repeatable research habit.
- Without a digest, it’s hard to keep momentum because every new topic restarts the whole “search, scan, decide” loop.
The Solution: Telegram-In, Gemini-Written Reddit Digest-Out
This workflow turns Telegram into your lightweight Reddit research console. You message your bot a keyword (for example, “voice AI agents”), and n8n immediately runs four Reddit searches in parallel using different sorting strategies to capture both evergreen winners and what’s trending. Those results get merged into one stream, then a deduping step removes repeats by comparing post IDs, so you don’t get the same thread showing up twice. Next, it standardizes the fields you actually care about (title, upvotes, subreddit, publish time, URL, and post text), then filters for quality: at least 50 upvotes, real content (not empty selftext), and freshness within the last 15 days. Finally, Gemini reads the cleaned set and writes a Telegram-formatted digest with the top posts and direct links back to Reddit.
It starts with a Telegram message. Reddit gets searched four ways, then the results are cleaned and filtered. Gemini writes the digest, and the workflow sends it straight back to your chat so you can skim it anywhere.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Four parallel Reddit searches (top, hot, relevance, new), merged and deduped by post ID | A curated 5–7 post digest in Telegram in roughly 10–20 seconds |
| Quality filtering (50+ upvotes, real selftext, posted in the last 15 days) plus a Gemini-written, Telegram-formatted summary | About 3 hours of research time back each week, with direct links to every source thread |
Example: What This Looks Like
Say you do topic research three times a week for content and ads. Manually, it’s common to spend about 10 minutes per Reddit sort (top, hot, new, relevance) plus another 20 minutes cleaning notes, so call it roughly an hour per topic. With this workflow, you send one keyword in Telegram, wait about 10–20 seconds, then skim a 5–7 post digest with links. That’s close to 3 hours back each week, without losing the “real language” you came for.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Telegram for sending keywords and receiving digests
- Reddit API to search posts across sorting methods
- Google Gemini API key (get it from ai.google.dev)
Skill level: Beginner. You’ll connect credentials, paste a bot token, and tweak a couple of filters.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
You send a keyword in Telegram. The Telegram trigger captures the message text (your query) and kicks off the workflow immediately.
Reddit gets searched in parallel. n8n runs multiple Reddit searches with different sorts so you don’t miss the “classic” threads and you also catch what’s popping right now.
Results get cleaned and narrowed. Everything is merged, duplicates are removed, and fields are standardized. Then the workflow filters down to higher-signal posts using simple rules like “50+ upvotes” and “posted in the last 15 days.”
Gemini writes the digest and Telegram receives it. The remaining posts are aggregated into one package, Gemini produces a Telegram-formatted summary, and the workflow replies back with a skimmable brief and direct links.
You can easily modify the upvote threshold to be stricter or the time window to be broader based on your needs. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Telegram Trigger
Set up the inbound Telegram trigger that starts the workflow whenever a user sends a message.
- Add the Incoming Telegram Hook node as your trigger.
- In Incoming Telegram Hook, keep Updates set to `message`.
- Credential Required: Connect your Telegram Bot API credentials in Incoming Telegram Hook (required for the trigger to receive messages).
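For reference, the trigger’s payload follows the Telegram Bot API `Update` shape. Here’s a minimal sketch of the two fields this workflow actually reads (assuming a trimmed-down update; real payloads carry many more fields):

```javascript
// Trimmed-down Telegram update, as delivered to the trigger node.
const update = {
  message: {
    text: "voice AI agents",   // the keyword the Reddit searches will use
    from: { id: 123456789 },   // user id the digest is sent back to
    chat: { id: 123456789 },
  },
};

// In n8n expressions these become {{$json.message.text}}
// and {{$json.message.from.id}}.
const keyword = update.message.text;
const replyTo = update.message.from.id;
```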
Step 2: Connect Reddit Data Sources (Parallel Search)
Configure the Reddit search nodes that run simultaneously to gather multiple types of results.
- Connect Incoming Telegram Hook to Fetch Top Reddit Posts, Retrieve Hot Reddit Posts, Find Relevant Reddit Posts, and Collect New Reddit Posts so all four searches run in parallel.
- In each Reddit node, set Keyword to `{{$json.message.text}}` and Location to `allReddit`.
- Set Operation to `search`, and ensure Additional Fields → Sort is `top` for Fetch Top Reddit Posts, `hot` for Retrieve Hot Reddit Posts, `relevance` for Find Relevant Reddit Posts, and `new` for Collect New Reddit Posts.
- Credential Required: Connect your Reddit credentials in all four Reddit nodes.
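Conceptually, the four nodes run the same query with only the sort changed. A sketch of the resulting configurations (hypothetical plain-object form; in n8n each is a separate Reddit node):

```javascript
// One keyword from Telegram, fanned out across four Reddit sort strategies.
const keyword = "voice AI agents"; // {{$json.message.text}} in the real workflow

const searches = ["top", "hot", "relevance", "new"].map((sort) => ({
  operation: "search",
  location: "allReddit",
  keyword,
  sort,
}));
```

Running all four in parallel is what catches both evergreen “top” threads and whatever is trending under “hot” and “new” right now.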
Step 3: Merge, Deduplicate, and Shape the Post Data
Combine results, remove duplicates, and map fields to a clean structure for filtering and AI summarization.
- In Combine Search Results, set Number of Inputs to `4` to merge all Reddit branches.
- In Eliminate Duplicate Posts, keep the provided JavaScript code that checks `item.json.id` or `item.json.name` to return unique posts.
- In Map Post Fields, add assignments for the following fields and values: title = `{{$json.title}}`, upvotes = `{{$json.ups}}`, subreddit_subscribers = `{{$json.subreddit_subscribers}}`, subreddit = `{{$json.subreddit}}`, date = `{{DateTime.fromSeconds($json.created).toLocaleString()}}`, url = `{{$json.url}}`, content = `{{$json.selftext}}`.
- Connect Combine Search Results → Eliminate Duplicate Posts → Map Post Fields.
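The dedupe step can be sketched as a small Code-node function. This is an illustrative version, assuming post identity lives in `item.json.id` (falling back to `item.json.name`), as the provided code checks:

```javascript
// Keep only the first occurrence of each Reddit post across the merged branches.
function removeDuplicates(items) {
  const seen = new Set();
  return items.filter((item) => {
    const key = item.json.id || item.json.name;
    if (seen.has(key)) return false; // already emitted from another sort
    seen.add(key);
    return true;
  });
}

// Example: three merged results where two share the same post id.
const merged = [
  { json: { id: "t3_abc", title: "Post A" } },
  { json: { id: "t3_xyz", title: "Post B" } },
  { json: { id: "t3_abc", title: "Post A (from another sort)" } },
];

const unique = removeDuplicates(merged); // 2 items survive
```

Because the same thread often ranks under “top”, “hot”, and “relevance” at once, this step is what keeps the digest from repeating itself.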
Step 4: Filter and Aggregate for AI
Filter out low-quality posts and aggregate remaining data into a single payload for the AI agent.
- In Filter High Quality Posts, set the conditions to: Upvotes `{{$json.upvotes}}` is `≥ 50`, Content `{{$json.content}}` is `notEmpty`, and Date `{{$json.date}}` is after `{{$now.minus({days: 15}).toFormat('M/d/yyyy')}}`.
- Connect Map Post Fields → Filter High Quality Posts → Aggregate For AI.
- In Aggregate For AI, set Aggregate to `aggregateAllItemData`.
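The three filter rules above amount to a single predicate. A sketch in plain JavaScript (assuming the field names set in Map Post Fields; the real workflow expresses these as Filter-node conditions, not code):

```javascript
const MIN_UPVOTES = 50;
const MAX_AGE_DAYS = 15;

function isHighQuality(post, now = new Date()) {
  const ageDays = (now - new Date(post.date)) / (1000 * 60 * 60 * 24);
  return (
    post.upvotes >= MIN_UPVOTES &&     // rule 1: at least 50 upvotes
    post.content.trim().length > 0 &&  // rule 2: real selftext, not a link-only post
    ageDays <= MAX_AGE_DAYS            // rule 3: posted within the last 15 days
  );
}
```

Raising `MIN_UPVOTES` or shrinking `MAX_AGE_DAYS` is the equivalent of tightening the Filter node’s conditions, which is the most common customization.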
Step 5: Set Up AI Summarization
Configure the AI agent to summarize the aggregated Reddit content using Gemini and conversation memory.
- In Compose AI Summary, set Text to `=give summary to the user with links user keyword :- {{ $('Incoming Telegram Hook').item.json.message.text }} reddit content :- {{ $json.data.toJsonString() }}`.
- Keep the System Message in Compose AI Summary as provided to enforce Telegram-friendly formatting.
- Connect Gemini Chat Model to Compose AI Summary as the language model.
- Connect Conversation Memory to Compose AI Summary as AI memory, with Session Key set to `{{$('Incoming Telegram Hook').item.json.message.from.id}}` and Context Window Length set to `500`.
- Credential Required: Connect your Google Gemini credentials in Gemini Chat Model.
- Credential Required: If your AI provider requires credentials for Compose AI Summary, add them there (not on Conversation Memory).
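The Text expression above is just string assembly. A hypothetical helper showing the prompt Gemini receives (the `buildPrompt` name and sample post are illustrative; the real workflow builds this inline in the node):

```javascript
// Mirrors the Text expression in Compose AI Summary: keyword + serialized posts.
function buildPrompt(keyword, posts) {
  return (
    "give summary to the user with links " +
    `user keyword :- ${keyword} ` +
    `reddit content :- ${JSON.stringify(posts)}`
  );
}

const prompt = buildPrompt("voice AI agents", [
  { title: "Post A", upvotes: 120, subreddit: "automation", url: "https://reddit.com/r/automation/..." },
]);
```

The system message then constrains the output to Telegram-friendly formatting, so the digest stays skimmable on a phone.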
Step 6: Configure Telegram Output
Send the AI-generated summary back to the Telegram user who initiated the request.
- Connect Compose AI Summary → Dispatch Telegram Reply.
- In Dispatch Telegram Reply, set Text to `{{$json.output}}`.
- Set Chat ID to `{{$('Incoming Telegram Hook').item.json.message.from.id}}`.
- Credential Required: Connect your Telegram Bot API credentials in Dispatch Telegram Reply.
Step 7: Test & Activate Your Workflow
Validate the end-to-end flow, then enable the workflow for live use.
- Click Execute Workflow and send a test keyword to your Telegram bot.
- Verify that Incoming Telegram Hook receives the message and all four Reddit nodes run in parallel.
- Confirm that Compose AI Summary outputs a formatted summary and Dispatch Telegram Reply sends it back to the same chat.
- Once verified, toggle the workflow to Active for production use.
Common Gotchas
- Reddit OAuth2 credentials can expire or need the right app type. If things break, check your app setup at reddit.com/prefs/apps and re-authorize in n8n first.
- If you’re using Wait nodes or external processing, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
Frequently Asked Questions
**How long does setup take?**
About 30 minutes if you already have the API keys.

**Do I need coding skills?**
No coding required. You’ll mainly connect Telegram, Reddit, and Gemini credentials, then adjust a couple of filter settings.

**Is it free to run?**
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Gemini API costs (typically pennies per digest, depending on length).

**Should I use n8n Cloud or self-host?**
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

**Can I customize the filters and digest length?**
Yes, and it’s the first tweak most people make. Adjust the “Filter High Quality Posts” node to raise the minimum upvotes above 50, tighten the “posted within 15 days” rule, or add a subreddit size requirement using the subscriber count field. If you want shorter digests, add a Limit step after the filter and cap it around 10–15 posts. You can also edit the AI Agent’s system message to change tone, structure, and formatting so it matches your brand voice.

**What if my Telegram bot isn’t responding?**
Most of the time it’s a bot token issue or the bot chat was never started. Regenerate or re-copy the BotFather token, update the Telegram credentials in n8n, and make sure you’ve sent at least one message to the bot so Telegram “opens” the conversation. If it still fails, check that you’re replying to the same chat ID captured by the trigger.

**How many digests can I run per day?**
A lot, as long as your n8n plan and APIs can keep up. On n8n Cloud, the limit is driven by monthly executions (Starter is suitable for light daily use, and higher tiers handle more). If you self-host, there’s no execution cap; it mainly depends on your server and Reddit/Gemini rate limits. Practically, most small teams can run dozens of digests a day without thinking about it.

**Do I need n8n for this, or can Zapier/Make handle it?**
Usually, yes, you need n8n, because this workflow requires branching, merging, filtering, and a proper AI summarization step, which gets awkward (and pricey) in simpler tools. n8n handles parallel Reddit searches, deduping, and richer logic without punishing you for every extra step. The self-hosted option is a big deal if you plan to run this often. Zapier or Make can still be fine if you only want a single Reddit search and a quick message, with minimal logic. If you’re unsure, Talk to an automation expert and you’ll get a straight answer based on your volume and tools.
Once this is running, Reddit stops being a time sink and starts being a steady stream of usable language. Set it up once, then just ask better questions.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.