Reddit to Telegram, daily content ideas ready to post
You open Reddit “for research” and suddenly it’s 45 minutes later, your tabs are a mess, and you still don’t have one solid post idea worth publishing. Even when you find a good thread, turning it into something usable (with a hook, a point of view, and a draft) becomes a second job.
This problem hits marketing leads hardest, but agency owners and founders feel it too. You need a steady stream of relevant topics without living inside comment sections. This Reddit-to-Telegram automation turns real community discussions into a daily Telegram briefing with scored opportunities and ready-to-edit post drafts.
Below you’ll see how the automation works, what it replaces, and what you’ll need to run it daily (without becoming “the automation person” on your team).
How This Automation Works
See how this solves the problem:
n8n Workflow Template: Reddit to Telegram, daily content ideas ready to post
```mermaid
flowchart LR
subgraph sg0["Daily Schedule Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "Daily Schedule", pos: "b", h: 48 }
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search AI Automation"]
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search n8n Posts"]
n3["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Search AI Business"]
n4@{ icon: "mdi:swap-vertical", form: "rounded", label: "Prepare Focused Data", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/telegram.svg' width='40' height='40' /></div><br/>Send to Telegram"]
n6["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge"]
n7@{ icon: "mdi:robot", form: "rounded", label: "Text Classifier", pos: "b", h: 48 }
n8@{ icon: "mdi:brain", form: "rounded", label: "OpenRouter Chat Model", pos: "b", h: 48 }
n9@{ icon: "mdi:robot", form: "rounded", label: "AI Agent", pos: "b", h: 48 }
n10@{ icon: "mdi:robot", form: "rounded", label: "AI Agent1", pos: "b", h: 48 }
n11@{ icon: "mdi:robot", form: "rounded", label: "Structured Output Parser", pos: "b", h: 48 }
n12["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge1"]
n13@{ icon: "mdi:swap-vertical", form: "rounded", label: "Prepare Focused Data1", pos: "b", h: 48 }
n14@{ icon: "mdi:swap-vertical", form: "rounded", label: "Edit Fields", pos: "b", h: 48 }
n15@{ icon: "mdi:swap-vertical", form: "rounded", label: "Edit Fields2", pos: "b", h: 48 }
n16@{ icon: "mdi:swap-vertical", form: "rounded", label: "Edit Fields1", pos: "b", h: 48 }
n17["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code"]
n18["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code1"]
n17 --> n18
n18 --> n5
n6 --> n7
n12 --> n16
n9 --> n14
n10 --> n15
n14 --> n12
n16 --> n17
n15 --> n12
n0 --> n1
n0 --> n2
n0 --> n3
n7 --> n13
n7 --> n4
n2 --> n6
n3 --> n6
n4 --> n10
n1 --> n6
n8 -.-> n7
n8 -.-> n10
n8 -.-> n11
n8 -.-> n9
n13 --> n9
n11 -.-> n10
n11 -.-> n9
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n7,n9,n10,n11 ai
class n8 aiModel
class n17,n18 code
classDef customIcon fill:none,stroke:none
class n1,n2,n3,n5,n6,n12,n17,n18 customIcon
```
The Challenge: Turning Reddit Noise Into Postable Ideas
Reddit is full of pain, urgency, and real language people actually use. That’s the upside. The downside is the volume. You might scan three subreddits, bookmark a few threads, copy quotes into a doc, then tell yourself you’ll “write later.” Later rarely happens. And when it does, you’re rewriting from scratch because you lost the context, the angle, or the original phrasing that made the thread useful in the first place.
It adds up fast. Here’s where it usually breaks down when you try to do this manually.
- You spend about an hour a day reading threads that never turn into content.
- Good opportunities slip by because you don’t have a consistent system for sorting “question” posts from “request” posts.
- Your idea backlog becomes a pile of links, so you still face a blank page when it’s time to publish.
- When you delegate the research, quality drops because the context and scoring criteria live in someone’s head.
The Fix: Daily Reddit Threads → Telegram Briefing + Draft Posts
This workflow runs on a daily schedule and pulls a small set of high-signal posts from multiple Reddit communities (AI automation, n8n discussions, and entrepreneur threads). Instead of dumping raw links into a spreadsheet, it classifies each post’s intent using AI: is it a question you can teach from, or a request that hints at buying intent? Then it routes each type through the right “analysis brain.” Questions get turned into educational angles and frameworks. Requests get shaped into sales-friendly content that still feels helpful, not pushy. Finally, everything is formatted, scored, and delivered to Telegram as a clean briefing you can skim in minutes.
The workflow starts at 12 PM and pulls up to 5 posts per subreddit (about 15 total). AI classifies and analyzes each post, then generates a handful of LinkedIn and Twitter drafts with relevancy and feasibility scores. Telegram receives the finished package, split into readable chunks so the message doesn’t get truncated.
What Changes: Before vs. After
| What This Eliminates | Impact You’ll See |
|---|---|
| About an hour a day of manual subreddit scanning | A Telegram briefing you can triage in roughly 5 minutes |
| Blank-page drafting from a backlog of raw links | Ready-to-edit LinkedIn and Twitter drafts with the original context intact |
| Ad-hoc judgment calls on which threads matter | Consistent relevancy and feasibility scores on every idea |
Real-World Impact
Say you monitor 3 subreddits and review about 15 posts a day. If you spend roughly 4 minutes deciding whether each post is useful, that’s about an hour already, and you still need to outline and write. With this workflow, the daily trigger runs automatically, analysis happens in the background, and you receive a Telegram briefing that’s ready to triage in about 5 minutes. Even if you only publish 3 times a week, you’re getting several hours back weekly while staying closer to what your market is literally asking for.
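The arithmetic above is easy to sanity-check. The post counts, per-post minutes, and 5-minute triage figure come from the paragraph; the five-day week is an assumption for the weekly total:

```javascript
// Back-of-envelope estimate of weekly time savings (5-day week is an assumption).
const postsPerDay = 15;    // 3 subreddits x 5 posts from the workflow config
const minutesPerPost = 4;  // manual triage time per post
const briefingMinutes = 5; // time to skim the Telegram briefing instead
const workDays = 5;

const manualDaily = postsPerDay * minutesPerPost;          // 60 minutes/day
const savedWeekly = (manualDaily - briefingMinutes) * workDays; // 275 minutes

console.log(`${manualDaily} min/day manual vs ${briefingMinutes} min briefing`);
console.log(`≈ ${(savedWeekly / 60).toFixed(1)} hours saved per week`);
```

That lands around 4.6 hours per week, consistent with the "several hours back weekly" claim.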
Requirements
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Reddit API for pulling posts from targeted subreddits.
- Telegram Bot to receive the daily briefing in chat.
- OpenRouter API key (get it from your OpenRouter dashboard).
Skill level: Intermediate. You’ll connect a few accounts, paste API keys, and tweak prompts or subreddit lists without touching heavy code.
Need help implementing this? Talk to an automation expert (free 15-minute consultation).
The Workflow Flow
A daily schedule kicks things off. At a set time (configured for 12 PM), n8n pulls a small batch of posts from three chosen subreddits focused on automation and entrepreneurship.
Posts are merged and classified by intent. The workflow combines the streams, then uses an AI text classifier to decide if each post is a “Question” (teaching opportunity) or a “Request” (service or help needed). That single decision changes everything downstream.
Two different AI agents generate the right kind of output. Educational threads go to an insight agent that pulls angles, lessons, and clear frameworks. Service-request threads go to a solution planning agent that focuses on feasibility, positioning, and what a helpful offer could look like.
Telegram gets a formatted briefing. Results are cleaned up into structured fields, assembled into a readable message, split into chunks, then sent via your Telegram bot so it’s easy to skim and forward.
You can easily modify the subreddit list to track your niche, or adjust the scoring criteria to prioritize ideas that match your offer. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Scheduled Pulse Trigger
Set the schedule that starts the Reddit discovery cycle every day.
- Add Scheduled Pulse Trigger to the canvas.
- Open Scheduled Pulse Trigger and set the rule to run at `12` (`triggerAtHour`).
- Verify the execution flow: Scheduled Pulse Trigger outputs to Retrieve Automation Posts, Fetch n8n Discussions, and Pull AI Business Threads in parallel.
Tip: Parallel branching means the three Reddit searches run at the same time, which speeds up discovery but can return items out of chronological order.
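In the exported workflow JSON, the trigger's rule looks roughly like this (a sketch of n8n's Schedule Trigger interval format, not copied from this template's export):

```json
{
  "rule": {
    "interval": [
      { "triggerAtHour": 12 }
    ]
  }
}
```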
Step 2: Connect Reddit Sources
Configure the three Reddit streams that feed the idea pipeline.
- Open Retrieve Automation Posts and set Operation to `search`, Subreddit to `ArtificialInteligence`, Keyword to `AI automation OR workflow automation OR business automation`, and Limit to `5`.
- Credential Required: Connect your `redditOAuth2Api` credentials to Retrieve Automation Posts.
- Open Fetch n8n Discussions and set Operation to `getAll`, Subreddit to `n8n`, Limit to `5`, and Filters → Category to `hot`.
- Credential Required: Connect your `redditOAuth2Api` credentials to Fetch n8n Discussions.
- Open Pull AI Business Threads and set Operation to `search`, Subreddit to `entrepreneur`, Keyword to `AI business OR artificial intelligence business OR automation startup`, and Limit to `5`.
- Credential Required: Connect your `redditOAuth2Api` credentials to Pull AI Business Threads.
- Connect all three Reddit nodes into Combine Reddit Streams and set Number of Inputs to `3`.
⚠️ Common Pitfall: Missing Reddit credentials will cause empty results, which leads to no downstream AI processing.
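Under the hood, a search node like Retrieve Automation Posts boils down to a call against Reddit's standard search endpoint (the node handles OAuth for you). This sketch only builds the URL to show how the parameters map:

```javascript
// Build the search URL a node like "Retrieve Automation Posts" effectively calls.
// restrict_sr=1 limits results to the named subreddit.
function buildRedditSearchUrl(subreddit, keyword, limit) {
  const params = new URLSearchParams({
    q: keyword,
    restrict_sr: "1",
    limit: String(limit),
  });
  return `https://oauth.reddit.com/r/${subreddit}/search?${params}`;
}

const url = buildRedditSearchUrl(
  "ArtificialInteligence",
  "AI automation OR workflow automation OR business automation",
  5
);
console.log(url);
```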
Step 3: Set Up the Intent Classification and Payload Mapping
Classify each Reddit post as a question or request, then map the text into the AI agents.
- Open Classify Text Intent and set Input Text to `{{ $json.selftext }}`.
- Confirm the two category definitions are present: `Questions` and `Requests`.
- Connect Classify Text Intent output 1 to Map Question Payload and output 2 to Map Request Payload.
- In Map Question Payload, set selftext to `{{ $json.selftext }}`.
- In Map Request Payload, set selftext to `{{ $json.selftext }}`.
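The classifier itself is an LLM call, but the routing contract it must satisfy is simple: every post lands in exactly one of the two buckets. This keyword heuristic is a hypothetical offline stand-in for that contract, not the actual AI node:

```javascript
// Hypothetical stand-in for the AI text classifier: route each post's selftext
// into "Questions" (teaching opportunity) or "Requests" (buying intent).
function classifyIntent(selftext) {
  const text = selftext.toLowerCase();
  const requestSignals = ["looking for", "can someone build", "willing to pay", "need help"];
  if (requestSignals.some((signal) => text.includes(signal))) return "Requests";
  return "Questions"; // default bucket: questions and teaching angles
}

console.log(classifyIntent("How does n8n handle Reddit rate limits?")); // Questions
console.log(classifyIntent("Looking for someone to build a bot."));     // Requests
```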
Step 4: Configure AI Analysis Agents and Parsers
Attach the language model and structured parser to generate analysis and social content ideas.
- Credential Required: Connect your `openRouterApi` credentials to OpenRouter Chat Engine.
- Ensure OpenRouter Chat Engine is connected as the language model for Classify Text Intent, Insight Analyst Agent, Solution Planner Agent, and Structured Result Parser.
- Open Insight Analyst Agent and set Text to `{{ $json.selftext }}` with Prompt Type set to `define` and Has Output Parser enabled.
- Open Solution Planner Agent and set Text to `{{ $json.selftext }}` with Prompt Type set to `define` and Has Output Parser enabled.
- Open Structured Result Parser and keep Auto Fix enabled with the provided JSON schema example.
Tip: Structured Result Parser is an AI sub-node. Credentials are added to OpenRouter Chat Engine, not the parser itself.
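The exact schema ships with the template, but judging from the fields mapped in the next step, the parser's target shape plausibly looks like this (field names inferred from the Format nodes, not copied from the template):

```javascript
// Hypothetical example of the parser's target shape, inferred from the
// downstream mappings: relevance_score.score/.meaning, summary, social_content.
const example = {
  relevance_score: { score: 8, meaning: "Strong fit for an educational post" },
  summary: "OP wants to automate lead research but is stuck on tooling choice.",
  social_content: "LinkedIn draft: Most founders don't have a research problem...",
};

// Minimal shape check of the kind Auto Fix exists to enforce.
function hasExpectedShape(output) {
  return (
    typeof output.relevance_score?.score === "number" &&
    typeof output.relevance_score?.meaning === "string" &&
    typeof output.summary === "string" &&
    typeof output.social_content === "string"
  );
}

console.log(hasExpectedShape(example)); // true
```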
Step 5: Format, Merge, and Finalize Content Fields
Standardize the AI responses and combine the analysis paths into a single dataset.
- In Format Question Output, map Score to `{{ $json.output.relevance_score.score }}`, Meaning to `{{ $json.output.relevance_score.meaning }}`, Summary to `{{ $json.output.summary }}`, and Social Media Content to `{{ $json.output.social_content }}`.
- In Format Request Output, map Score to `{{ $json.output.relevance_score.score }}`, Meaning to `{{ $json.output.relevance_score.meaning }}`, Summary to `{{ $json.output.summary }}`, and Social Media Content to `{{ $json.output.social_content }}`.
- Configure Merge Analysis Results with Mode set to `combine` and Combine By set to `combineByPosition`.
- In Finalize Content Fields, set Meaning to `{{ $json.Meaning }}`, Summary to `{{ $json.Summary }}`, and Social Media Content to `{{ $json['Social Media Content'] }}`.
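In plain terms, the two Format nodes flatten the parser's nested output into top-level fields. A hypothetical Code-node equivalent of that mapping:

```javascript
// Hypothetical equivalent of the Format Question/Request Output mappings:
// flatten the nested AI output into the top-level fields used downstream.
function formatAnalysis(output) {
  return {
    Score: output.relevance_score.score,
    Meaning: output.relevance_score.meaning,
    Summary: output.summary,
    "Social Media Content": output.social_content,
  };
}

const flat = formatAnalysis({
  relevance_score: { score: 7, meaning: "Good teaching angle" },
  summary: "Thread about automating Reddit research",
  social_content: "Draft: Stop doom-scrolling for content ideas...",
});
console.log(flat.Score, flat.Meaning); // 7 "Good teaching angle"
```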
Step 6: Assemble and Send the Telegram Output
Generate a formatted report, split it into chunks, and send it to Telegram.
- Open Assemble Telegram Message and keep the provided JavaScript to build the consolidated report.
- Open Split Message Chunks and keep `maxChunkSize` set to `3800` for Telegram limits.
- Open Dispatch Telegram Alert and set Text to `{{ $json.message }}` and Chat ID to `[YOUR_ID]`.
- Credential Required: Connect your `telegramApi` credentials to Dispatch Telegram Alert.
⚠️ Common Pitfall: If Chat ID remains [YOUR_ID], the message will fail. Replace it with your numeric Telegram chat ID.
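Telegram caps a single message at 4,096 characters, which is why the chunk size sits at 3,800 with headroom for formatting. The template ships its own splitting code; this is an illustrative sketch of the same idea that prefers breaking on newlines so chunks stay readable:

```javascript
// Split a long report into Telegram-safe chunks, preferring newline boundaries.
function splitMessageChunks(text, maxChunkSize = 3800) {
  const chunks = [];
  let rest = text;
  while (rest.length > maxChunkSize) {
    // Break at the last newline inside the window, or hard-cut if none exists.
    let cut = rest.lastIndexOf("\n", maxChunkSize);
    if (cut <= 0) cut = maxChunkSize;
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).replace(/^\n/, "");
  }
  if (rest.length > 0) chunks.push(rest);
  return chunks;
}

const report = "line\n".repeat(2000); // ~10,000 characters
const chunks = splitMessageChunks(report);
console.log(chunks.length, chunks.every((c) => c.length <= 3800)); // 3 true
```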
Step 7: Test and Activate Your Workflow
Run a manual test and enable the schedule for production.
- Click Execute Workflow to run Scheduled Pulse Trigger manually.
- Confirm that Combine Reddit Streams returns items from all three sources.
- Verify Assemble Telegram Message outputs a single payload and Split Message Chunks emits multiple messages when needed.
- Check Telegram to confirm Dispatch Telegram Alert delivers the formatted report.
- Toggle the workflow to Active to enable the scheduled daily run.
Watch Out For
- Reddit OAuth credentials can expire or get blocked if permissions are wrong. If posts suddenly stop flowing, check your Reddit app settings at reddit.com/prefs/apps and confirm the credential in n8n matches.
- AI calls take variable time. If Telegram chunking runs right after a heavy AI step and downstream nodes fail on empty responses, add a short Wait node or a small delay before them.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever, especially on LinkedIn drafts that tend to sound the same across tools.
Common Questions
How long does setup take?
About 30 minutes if you already have your API keys.

Can I set this up without coding experience?
Yes. You’ll mostly be connecting accounts and pasting credentials. The “hard part” is deciding which subreddits and keywords matter to your business.

Can I run it for free?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenRouter API usage costs (it depends on the model you pick).

Where should I host it?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

How do I customize it for my niche?
You can swap the monitored communities by editing the three Reddit retrieval nodes (the ones pulling automation posts, n8n discussions, and AI business threads). Most teams also tweak the keyword filtering and rewrite the two agent prompts so the “Question” and “Request” outputs match their offer, voice, and target platform. If you want the briefing to feed a backlog, add Google Sheets right after “Finalize Content Fields.” And if Telegram isn’t your team’s home base, the same final message can be sent to email via Mailchimp.

Why isn’t my Telegram bot receiving the briefing?
Usually it’s a bad bot token or the bot was never added to the chat you’re trying to message. Regenerate the token in BotFather if needed, update the Telegram credential in n8n, then verify the chat ID is correct. One more thing: Telegram can reject very long messages, so keep the chunk-splitting code in place if you expand the briefing format.

How many posts can it handle?
It’s designed for about 15 Reddit posts per day out of the box.

Is n8n a better fit for this than Zapier or Make?
Often, yes, because this workflow isn’t just “move data from A to B.” You have branching logic (Questions vs Requests), structured parsing, and multi-step AI generation, which gets expensive or awkward in many no-code tools. n8n also gives you self-hosting, so you’re not paying per tiny step when you iterate on prompts. Zapier or Make can still be fine if you only want a simple daily digest with one AI call. If you’re unsure, Talk to an automation expert and get a recommendation based on your posting volume.
You end up with clearer topics, better drafts, and a daily briefing that doesn’t demand your attention. Set it up once, then let Reddit do the research for you.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.