Reddit to Slack, a clean digest your team will read
Your team keeps saying “we should watch Reddit for this,” then nobody does. Or worse, someone posts a raw link dump in Slack, it gets ignored, and you’re back to tab hopping and half-remembered threads.
This Reddit-to-Slack digest problem hits marketing managers first because timing matters. But founders and community leads feel it too. You want the signal, not the noise, and you want it where work already happens.
This workflow fetches top Reddit posts, cleans and deduplicates them, uses AI to rank what matters, then posts a readable digest into Slack (and optionally Telegram or Discord). You’ll see how it works, what you need, and where teams usually trip up.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Reddit to Slack, a clean digest your team will read
```mermaid
flowchart LR
subgraph sg0["Scheduled Digest Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "Scheduled Digest Trigger", pos: "b", h: 48 }
n1@{ icon: "mdi:swap-vertical", form: "rounded", label: "Settings Setup", pos: "b", h: 48 }
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Build Subreddit Array", pos: "b", h: 48 }
n3@{ icon: "mdi:swap-vertical", form: "rounded", label: "Iterate Subreddit Items", pos: "b", h: 48 }
n4["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Retrieve Reddit Feed"]
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Clean Post Records"]
n6["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Merge and Filter Posts"]
n7@{ icon: "mdi:robot", form: "rounded", label: "AI Digest Curator", pos: "b", h: 48 }
n8@{ icon: "mdi:brain", form: "rounded", label: "Gemini Chat Model", pos: "b", h: 48 }
n9["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Format Digest Output"]
n10@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Validate Digest Text", pos: "b", h: 48 }
n11["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/telegram.svg' width='40' height='40' /></div><br/>Post to Telegram"]
n12["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/discord.svg' width='40' height='40' /></div><br/>Dispatch to Discord"]
n13["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/slack.svg' width='40' height='40' /></div><br/>Send Slack Update"]
n10 --> n11
n10 --> n12
n10 --> n13
n1 --> n2
n8 -.-> n7
n7 --> n9
n5 --> n6
n2 --> n3
n6 --> n7
n3 --> n4
n4 --> n5
n0 --> n1
n9 --> n10
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n7 ai
class n8 aiModel
class n10 decision
class n4 api
class n5,n6,n9 code
classDef customIcon fill:none,stroke:none
class n4,n5,n6,n9,n11,n12,n13 customIcon
```
The Problem: Reddit Research Turns Into Tab Chaos
Reddit is great for market intel. It’s also exhausting. You start with “just checking a few subreddits,” and suddenly you’re 30 tabs deep, pulling half-relevant posts into a doc, then trying to explain the context to your team in Slack. Even if you do the work, it’s messy: duplicates across subreddits, low-effort meme posts, and titles that don’t tell your team what the thread is actually about. After a week or two, people stop sharing because it’s too much effort to make it readable.
It adds up fast. Here’s where it breaks down.
- Manually checking 6–10 subreddits can easily burn about 2 hours a week, and the cost is usually focus, not just time.
- Low-quality posts and repeated links slip in, which means the digest gets ignored after a few noisy days.
- “Just paste the links” creates zero shared understanding, so decisions still happen off gut feel.
- When one person owns the habit, the whole “Reddit insights” pipeline dies the moment they get busy.
The Solution: An AI-Curated Reddit Digest Posted to Slack
This n8n workflow turns Reddit into a dependable, readable briefing inside Slack. It runs on a schedule (or on demand), pulls top posts from the subreddits you choose using Reddit’s public JSON feed, and then cleans the data so you’re not reviewing junk. Next, it removes duplicates and filters posts using simple rules like minimum upvotes plus your include/exclude keywords. After that, an AI agent ranks what’s left by relevance and generates a formatted digest that your team will actually scan. If the digest text looks valid, it posts the final update to Slack, with the same digest optionally going to Telegram or Discord if you want multiple destinations.
The workflow starts with a scheduled trigger and a settings block where you control subreddits, time filters, keywords, and post counts. Then it fetches, cleans, merges, and hands the shortlist to an AI model (Gemini, OpenAI, or Claude). Finally, the formatted digest is validated and delivered to Slack (plus other channels you enable).
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Fetching top posts from every subreddit on your list | No more tab hopping or manual link dumps |
| Cleaning, deduplicating, and filtering by upvotes and keywords | A digest with signal instead of noise |
| AI ranking and summarizing what’s left | Roughly 3 hours back every week |
| Posting the formatted digest to Slack (plus Telegram/Discord) | A briefing habit that doesn’t depend on one person |
Example: What This Looks Like
Say you monitor 8 subreddits and you usually skim 15 posts per subreddit to find “the good stuff.” If you spend maybe 2 minutes per post including clicks and context switching, that’s about 4 hours of scanning per week. With this workflow, you set a schedule once and let it curate automatically: a minute to adjust settings when needed, a few minutes of AI processing, then the digest lands in Slack ready to read. For many teams, that’s roughly 3 hours back every week, without losing visibility.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Slack for posting the digest to a channel
- Google Gemini or OpenAI to rank and summarize posts
- AI provider API key (get it from your provider’s developer console)
Skill level: Beginner. You’ll connect accounts, paste an API key, and edit a few settings like subreddit names and keywords.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A scheduled run kicks things off. The workflow can run daily, weekly, or on whatever cadence you choose, so the digest becomes a habit without relying on someone remembering.
Your subreddit list is prepared and looped through. A settings node builds an array of subreddits, then the workflow iterates through them so you can monitor many communities without duplicating the whole automation.
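As a rough sketch (field names assumed from the workflow's Settings Setup node), the Build Subreddit Array step boils down to splitting the comma-separated `subreddits` setting into a clean array the loop node can iterate over:

```javascript
// Minimal sketch, not the workflow's exact code: turn the
// comma-separated "subreddits" setting into an array.
const settings = {
  subreddits: "AI_Agents, generativeAI, MachineLearning",
};

const subredditArray = settings.subreddits
  .split(",")
  .map((s) => s.trim())   // drop whitespace around each name
  .filter(Boolean);       // guard against stray trailing commas

console.log(subredditArray); // → ['AI_Agents', 'generativeAI', 'MachineLearning']
```

This matches the expression you'll paste into the node in Step 2; the loop node then hands one subreddit per iteration to the HTTP request.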
Reddit posts are fetched, cleaned, and filtered. n8n pulls the top posts via HTTP Request, then code steps clean up the records, deduplicate overlaps, and apply simple rules like upvote thresholds and keyword includes/excludes.
AI curates what’s worth your team’s attention. The AI agent (paired with a Gemini/OpenAI chat model) ranks the remaining posts by relevance, then generates a concise digest. Another code step formats it so it looks good in chat.
The digest is validated and delivered. If the text passes a quick check (not empty, long enough to be useful), it gets posted to Slack. Telegram and Discord delivery nodes are also there if you want them.
You can easily modify the subreddit list to focus on a new product line or campaign based on your needs. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Schedule Trigger
Set up the workflow to run automatically on a daily schedule.
- Add and open Scheduled Digest Trigger.
- Set the Rule to the cron expression `0 9 * * *` to run daily at 9 AM.
- Connect Scheduled Digest Trigger to Settings Setup.
Step 2: Connect Reddit Source Settings
Define subreddit inputs and configure the Reddit API request.
- Open Settings Setup and add the following assignments:
  - subreddits: `AI_Agents,generativeAI,ArtificialInteligence,MachineLearning,OpenAI,ChatGPT`
  - posts_per_subreddit: `25`
  - time_filter: `today`
  - total_posts_in_digest: `10`
  - digest_title: `🤖 AI Daily Digest`
  - focus_keywords: `AI agents, ChatGPT, LLM, machine learning, research, tool, breakthrough`
  - exclude_keywords: `crypto, NFT, political, spam`
  - min_upvotes: `10`
- Open Build Subreddit Array and set subreddit_array to `{{ $json.subreddits.split(',').map(s => s.trim()) }}`.
- Open Iterate Subreddit Items and set subreddit to `{{ $json.subreddit_array[$itemIndex] }}`.
- Open Retrieve Reddit Feed and set URL to `https://www.reddit.com/r/{{ $json.subreddit }}/top.json?t={{ $('Settings Setup').first().json.time_filter }}&limit={{ $('Settings Setup').first().json.posts_per_subreddit }}`.
- Enable Send Headers and add the header User-Agent with value `n8n-reddit-automation/1.0`.
- Credential Required: Connect your httpHeaderAuth credentials in Retrieve Reddit Feed.
⚠️ Common Pitfall: Reddit blocks requests without a valid User-Agent header or proper auth configuration in Retrieve Reddit Feed.
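To make the URL expression above concrete, here is a sketch of how the request is assembled. Field names mirror the Settings Setup node; note that Reddit's public JSON endpoint accepts `t` values of `hour`, `day`, `week`, `month`, `year`, or `all`:

```javascript
// Sketch of the Retrieve Reddit Feed URL construction (assumed
// field names; the workflow builds this via n8n expressions).
const settings = { time_filter: "day", posts_per_subreddit: 25 };
const subreddit = "MachineLearning";

const url =
  `https://www.reddit.com/r/${subreddit}/top.json` +
  `?t=${settings.time_filter}&limit=${settings.posts_per_subreddit}`;

// Reddit tends to reject anonymous clients that lack a
// descriptive User-Agent header, hence the pitfall above.
const headers = { "User-Agent": "n8n-reddit-automation/1.0" };

console.log(url); // → https://www.reddit.com/r/MachineLearning/top.json?t=day&limit=25
```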
Step 3: Clean and Merge Reddit Posts
Normalize data and apply filtering rules before sending to the AI curator.
- Open Clean Post Records to ensure the JavaScript logic is intact for removing low-quality or removed posts and sorting by score.
- Open Merge and Filter Posts and confirm the logic for deduplication and keyword filters is in place.
- Verify the execution order: Retrieve Reddit Feed → Clean Post Records → Merge and Filter Posts.
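The two code nodes are prebuilt, but it helps to know what they do when you tune thresholds. This is a minimal sketch (not the workflow's exact code) of the kind of logic they apply: drop low-score posts, apply include/exclude keyword rules, and dedupe cross-posts by URL:

```javascript
// Sketch of Clean Post Records + Merge and Filter Posts logic.
// Settings values mirror the Settings Setup defaults.
const settings = {
  min_upvotes: 10,
  focus_keywords: ["ai agents", "llm", "research"],
  exclude_keywords: ["crypto", "nft"],
};

const posts = [
  { title: "New LLM research drops", url: "https://a", score: 42 },
  { title: "New LLM research drops", url: "https://a", score: 42 }, // cross-post
  { title: "NFT giveaway", url: "https://b", score: 90 },           // excluded keyword
  { title: "Low effort meme", url: "https://c", score: 3 },         // below threshold
];

const seen = new Set();
const filtered = posts.filter((p) => {
  const title = p.title.toLowerCase();
  if (p.score < settings.min_upvotes) return false;
  if (settings.exclude_keywords.some((k) => title.includes(k))) return false;
  if (!settings.focus_keywords.some((k) => title.includes(k))) return false;
  if (seen.has(p.url)) return false; // dedupe across subreddits
  seen.add(p.url);
  return true;
});

// Highest-scoring posts first, like the sorting step in the workflow.
const sorted = filtered.sort((a, b) => b.score - a.score);
console.log(sorted.map((p) => p.title)); // → ['New LLM research drops']
```

If your digest feels too thin or too noisy, this is where to look: raise or lower `min_upvotes`, or loosen the focus keywords.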
Step 4: Set Up AI Digest Generation
Use Gemini to curate and format the digest from filtered posts.
- Open AI Digest Curator and confirm the Text prompt is set to the provided long-form instruction text starting with `You are an expert content curator for Reddit news digests...`.
- Ensure Gemini Chat Model is connected as the language model for AI Digest Curator; credentials must be added to Gemini Chat Model, not the agent node.
- Credential Required: Connect your Google Gemini credentials in Gemini Chat Model.
- Open Format Digest Output and keep the code that extracts `formatted_output` and adds `timestamp` and `digest_title`.
- Confirm the execution order: Merge and Filter Posts → AI Digest Curator → Format Digest Output → Validate Digest Text.
⚠️ Common Pitfall: If Gemini Chat Model credentials aren’t set, AI Digest Curator will fail without a clear error in the output.
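For orientation, the Format Digest Output step amounts to something like the following sketch (field names assumed from the workflow): take the agent's text, prepend the configured title, and attach a timestamp:

```javascript
// Sketch, not the workflow's exact code: wrap the AI agent's
// text into the fields downstream nodes expect.
const agentResult = { output: "1. Post A — why it matters\n2. Post B — key takeaway" };
const settings = { digest_title: "🤖 AI Daily Digest" };

const item = {
  formatted_output: `${settings.digest_title}\n\n${agentResult.output}`,
  digest_title: settings.digest_title,
  timestamp: new Date().toISOString(), // when this digest was generated
};

console.log(item.formatted_output.startsWith(settings.digest_title)); // → true
```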
Step 5: Configure Output Channels
Send the curated digest to Telegram, Discord, and Slack when it passes validation.
- Open Validate Digest Text and verify the condition checks that `{{ $json.formatted_output }}` is not empty.
- Open Post to Telegram and set Text to `{{ $json.formatted_output }}` and Chat ID to `{{ $('Settings Setup').first().json.telegram_chat_id || '[YOUR_ID]' }}`.
- Open Dispatch to Discord and set Content to `{{ $json.formatted_output }}` with Authentication set to `webhook`.
- Open Send Slack Update and set Text to `{{ $json.formatted_output }}` and Select to `channel`.
- Confirm parallel execution: Validate Digest Text fans out to Post to Telegram, Dispatch to Discord, and Send Slack Update in parallel.
- Credential Required: Add Telegram credentials in Post to Telegram.
- Credential Required: Add Discord webhook credentials in Dispatch to Discord.
- Credential Required: Add Slack credentials in Send Slack Update.
⚠️ Common Pitfall: Leaving `[YOUR_ID]` as the Chat ID in Post to Telegram, or leaving the channel unset in Send Slack Update, means the digest silently goes nowhere.
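The validation gate is simple but worth understanding, since it's what keeps empty or truncated AI output from being posted. A minimal sketch (the 50-character threshold is an assumption, not necessarily the workflow's exact value):

```javascript
// Sketch of the Validate Digest Text check: forward the digest only
// when it is non-empty and long enough to be useful.
function isValidDigest(text) {
  return typeof text === "string" && text.trim().length >= 50;
}

console.log(isValidDigest(""));             // → false (empty AI output)
console.log(isValidDigest("x".repeat(80))); // → true
```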
Step 6: Test & Activate
Validate the entire pipeline before enabling scheduled runs.
- Click Execute Workflow to run a manual test from Scheduled Digest Trigger.
- Check that Retrieve Reddit Feed returns JSON and Format Digest Output produces a non-empty `formatted_output`.
- Verify that Telegram, Discord, and Slack each receive the same formatted digest after Validate Digest Text passes.
- When the test succeeds, switch the workflow to Active to allow scheduled execution.
Common Gotchas
- Slack credentials can expire or need specific permissions. If things break, check the Slack app OAuth scopes and token status in your n8n credentials first.
- If you’re using Wait nodes or external processing, run times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Default prompts in AI nodes are generic. Add your brand voice and “what counts as relevant” early or you will be editing outputs forever.
Frequently Asked Questions
**How long does this take to set up?** About 30 minutes if your Slack and AI accounts are ready.
**Do I need to know how to code?** No. You’ll mostly paste credentials and edit the subreddit/keyword settings. The code nodes are already built for you.
**Is this free to run?** Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in AI API costs (Gemini has a free tier, and OpenAI usage is usually a few cents per digest).
**Where should I host n8n?** Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
**Can I run this as a weekly digest instead of daily?** Yes, and it’s honestly one of the best ways to use it. Change the schedule in the Scheduled Digest Trigger, then adjust the AI Digest Curator prompt to produce a “top 5 themes + links” summary instead of individual post bullets. Many teams also tweak the upvote threshold in the filtering logic so the weekly brief stays tight. If you still want daily coverage, send daily digests to a private Slack channel and only post the weekly summary to your main team channel.
**Why isn’t the digest posting to Slack?** Usually it’s expired credentials or missing OAuth permissions. Reconnect Slack in n8n credentials, confirm the app has access to the target channel, and make sure the channel still exists (it happens). If you’re posting to a private channel, the Slack app must be invited to that channel. Rate limiting is rare for one daily digest, but can show up if you test in rapid loops.
**How many posts can the digest handle?** A lot more than you’ll want to read, so you’ll usually cap it on purpose.
**Why n8n instead of Zapier or Make?** For this use case, n8n tends to fit better because you can do deduping, branching, and richer formatting without paying extra per “path,” and self-hosting avoids strict task limits. The AI agent approach is also easier to shape when you need ranking plus a clean digest format, not just a summary. Zapier or Make can still work if you only need a simple “fetch and post” flow, but you’ll often rebuild a lot of logic using extra steps. If you’re unsure, Talk to an automation expert and we’ll help you choose quickly.
Once this is running, Reddit stops being a guilty “I’ll check later” tab and turns into a clean briefing your team can act on. Set it up once, then let the workflow do the sorting.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.