Reddit to Google Sheets, leads captured and cleaned
You find a perfect Reddit thread, open it “for later,” and then it’s gone under 200 new posts. Or you copy-paste it into a spreadsheet, but the formatting is a mess and you never capture the context that makes the lead actually usable.
Missed Reddit leads hit marketers doing outbound research hardest, but a founder trying to land early customers feels it too, as does a freelancer hunting gigs without wanting to live on Reddit all day.
This n8n workflow pulls fresh posts from your chosen subreddits every 2 hours, filters for real opportunities, grabs the top-level comments (minus mod noise), then appends clean rows into Google Sheets. You’ll see exactly what it automates, what you get back, and what to watch out for.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Reddit to Google Sheets, leads captured and cleaned
flowchart LR
subgraph sg0["Schedule Flow"]
direction LR
n0@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Hiring-related Comments", pos: "b", h: 48 }
n1@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger", pos: "b", h: 48 }
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request"]
n3@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata", pos: "b", h: 48 }
n4@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request1"]
n6@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata1", pos: "b", h: 48 }
n7@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n8["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request2"]
n9@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata2", pos: "b", h: 48 }
n10@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n11["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request3"]
n12@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata3", pos: "b", h: 48 }
n13@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n14["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request4"]
n15@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata4", pos: "b", h: 48 }
n16@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n17["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request5"]
n18@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata5", pos: "b", h: 48 }
n19@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n20["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge"]
n21["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request6"]
n22@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata6", pos: "b", h: 48 }
n23@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n24["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP Request7"]
n25@{ icon: "mdi:swap-vertical", form: "rounded", label: "Extract Post Metadata7", pos: "b", h: 48 }
n26@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate array into individu..", pos: "b", h: 48 }
n27["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/reddit.svg' width='40' height='40' /></div><br/>Get many comments from multi.."]
n28@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Remove Mod Comments", pos: "b", h: 48 }
n29@{ icon: "mdi:database", form: "rounded", label: "Get present leads", pos: "b", h: 48 }
n30["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Filter Unique Leads"]
n31@{ icon: "mdi:database", form: "rounded", label: "Add Leads to Google Sheet", pos: "b", h: 48 }
n20 --> n27
n2 --> n3
n5 --> n6
n8 --> n9
n11 --> n12
n14 --> n15
n17 --> n18
n21 --> n22
n24 --> n25
n1 --> n2
n1 --> n5
n1 --> n8
n1 --> n11
n1 --> n14
n1 --> n17
n1 --> n21
n1 --> n24
n29 --> n30
n30 --> n31
n28 --> n0
n3 --> n4
n6 --> n7
n9 --> n10
n12 --> n13
n15 --> n16
n18 --> n19
n22 --> n23
n25 --> n26
n0 --> n29
n4 --> n20
n27 --> n28
n7 --> n20
n10 --> n20
n13 --> n20
n16 --> n20
n19 --> n20
n23 --> n20
n26 --> n20
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n1 trigger
class n0,n28 decision
class n29,n31 database
class n2,n5,n8,n11,n14,n17,n21,n24 api
class n30 code
classDef customIcon fill:none,stroke:none
class n2,n5,n8,n11,n14,n17,n20,n21,n24,n27,n30 customIcon
The Problem: Reddit leads slip through the cracks
Reddit is great for finding people who are actively asking for help, hiring, or looking for recommendations. It’s also chaotic. Threads move fast, and the “real” lead is often hidden in the first few comments, not the post itself. Manually tracking all of that turns into a daily scavenger hunt: open five tabs, skim, copy text, clean it, paste it, find the URL again, then try to remember why it mattered. Miss a couple hours and you miss the moment.
It adds up fast. Here’s where it breaks down in real life.
- You end up checking the same subreddits repeatedly because there’s no reliable “latest leads” list.
- Copy-pasting comments destroys formatting, and later you can’t tell what was a quote vs. the actual request.
- Moderator replies and auto-messages fill your notes, so the spreadsheet grows while lead quality drops.
- Without deduplication, you log the same thread multiple times and waste outreach time on repeats.
The Solution: Reddit leads logged to Sheets automatically
This workflow runs on a schedule (every 2 hours) and uses Reddit’s OAuth connection to pull the latest posts from subreddits you choose. It then filters for posts that match the signals you care about, like mentions of freelance work, gigs, or n8n-related needs. For each relevant post, it fetches the comment thread, keeps only the top-level comments (so you get direct replies, not deep rabbit holes), and removes moderator comments that usually don’t help you qualify a lead. Finally, it formats everything into consistent fields and appends a clean row to your Google Sheet, including the subreddit, title, post URL, comment body, username, and timestamp.
The workflow starts with a cron trigger and a set of Reddit API requests that pull up to 10 recent posts per subreddit stream. After merging those streams, it collects comments, excludes mod replies, checks your existing sheet to prevent duplicates, then appends only net-new leads.
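As a rough sketch of what the “Extract Post Metadata” steps do, each item in a Reddit listing can be flattened into one clean lead record. The field names below are illustrative, not necessarily the workflow’s exact ones:

```javascript
// Map a Reddit listing child to a flat lead record.
// Field names are illustrative; the workflow's Set nodes may differ.
function toLead(child) {
  const d = child.data;
  return {
    subreddit: d.subreddit,
    title: d.title,
    url: `https://www.reddit.com${d.permalink}`,
    author: d.author,
    createdUtc: new Date(d.created_utc * 1000).toISOString(),
  };
}

// Minimal sample shaped like one item from a /r/<sub>/new listing.
const sample = {
  data: {
    subreddit: "forhire",
    title: "[Hiring] Need an n8n automation built",
    permalink: "/r/forhire/comments/abc123/hiring_need_an_n8n/",
    author: "someuser",
    created_utc: 1700000000,
  },
};

const lead = toLead(sample);
```

In n8n this mapping usually lives in a Set node or a small Code node; either way, normalizing field names early keeps every downstream node simple.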
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Pulls fresh posts from your chosen subreddits every 2 hours | A steady stream of new leads without manual subreddit checks |
| Filters posts and comments for hiring and freelance signals | Only relevant opportunities reach your sheet |
| Keeps top-level comments and removes moderator replies | Less noise, more context you can actually use |
| Deduplicates against existing rows before appending | No repeat outreach on threads you already logged |
Example: What This Looks Like
Say you track 8 subreddits and manually review 10 new posts each. At roughly 2 minutes per post to skim, open comments, and decide “lead or not,” that’s about 160 minutes per sweep. Do that twice a day and you’re past 5 hours a day, and you still forget to log details. With this workflow running every 2 hours, your “time spent” becomes a quick 10-minute review of the sheet and picking who to contact.
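The arithmetic above, as a quick sanity check:

```javascript
// Manual review cost: subreddits × posts per sweep × minutes per post.
const subreddits = 8;
const postsPerSub = 10;
const minutesPerPost = 2;

const minutesPerSweep = subreddits * postsPerSub * minutesPerPost;
const minutesPerDay = minutesPerSweep * 2; // two sweeps a day
const hoursPerDay = minutesPerDay / 60;
```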
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Reddit app (OAuth2) to read posts and comments.
- Google Sheets to store leads in a clean table.
- Google account access (authorize in n8n credentials).
Skill level: Intermediate. You’ll connect OAuth credentials and match spreadsheet columns, but you won’t be writing new code from scratch.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A scheduled check runs in the background. The cron trigger fires every 2 hours, so you get fresh threads without remembering to look.
Reddit posts are pulled and shaped into usable fields. Multiple HTTP requests fetch recent posts, then “Set” steps map details like title, URL, subreddit, and timestamps so everything stays consistent downstream.
Only the posts worth your time continue. IF filters remove noise, then the workflow fetches comments for the remaining posts, keeps top-level replies, and excludes moderator comments that usually add clutter instead of context.
Your Google Sheet stays clean. The workflow reads existing rows, deduplicates in code, and appends only new lead records into a pre-created sheet with matching column names.
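The “keeps top-level replies” step can be sketched like this: in Reddit’s data model, a comment whose parent_id carries the t3_ (link) prefix is a direct reply to the post, while t1_ means a reply to another comment. This is a hedged illustration, not the workflow’s exact node configuration:

```javascript
// Keep only direct replies to the post: parent_id "t3_..." points at
// the post itself, "t1_..." at another comment.
function topLevelOnly(comments) {
  return comments.filter((c) => c.parent_id && c.parent_id.startsWith("t3_"));
}

const comments = [
  { id: "c1", parent_id: "t3_abc123", body: "I'm hiring for this" },
  { id: "c2", parent_id: "t1_c1", body: "A nested reply" },
];

const kept = topLevelOnly(comments);
```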
You can easily modify the subreddit list and keyword filters to match your niche. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Scheduled Trigger
Set up the workflow schedule so all API requests start automatically on your preferred cadence.
- Add and open Scheduled Automation Trigger.
- Configure the schedule timing to match your lead-capture frequency (daily, hourly, etc.).
- Confirm that Scheduled Automation Trigger outputs to eight parallel request nodes.
Step 2: Connect the Parallel API Requests
Configure all HTTP request nodes that fetch posts in parallel batches.
- Open Primary API Request and define the target endpoint and request method.
- Repeat the same setup for Secondary API Request, Third API Request, Fourth API Request, Fifth API Request, Sixth API Request, Seventh API Request, and Eighth API Request.
- Ensure each request returns a list of posts suitable for the downstream mapping nodes.
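A hedged sketch of the endpoint each request might call. The exact URL depends on your Reddit app and credentials, but a typical per-subreddit “new posts” listing looks like this (raw_json=1 is an assumption here, a common option that avoids HTML-escaped text, not something the workflow confirms):

```javascript
// Build the listing URL one HTTP Request node might call.
// limit=10 matches the article's "up to 10 recent posts per stream".
function buildListingUrl(subreddit, limit = 10) {
  return `https://oauth.reddit.com/r/${encodeURIComponent(subreddit)}/new?limit=${limit}&raw_json=1`;
}

const url = buildListingUrl("forhire");
```

In the n8n node, the same URL goes in the URL field with the Reddit OAuth2 credential selected for authentication.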
Step 3: Map and Split Post Batches
Normalize post data and split each response into individual items for processing.
- Configure Map Post Details to set the fields you need from Primary API Request.
- Repeat field mapping for Map Post Details B through Map Post Details H to keep field names consistent across batches.
- Ensure each mapping node connects to its respective split node: Split Posts Batch A through Split Posts Batch H.
- Verify all Split Posts Batch nodes output to Combine Post Streams.
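The mapping-and-splitting pattern in this step boils down to flattening Reddit’s listing envelope into one item per post, roughly:

```javascript
// Reddit listings wrap posts in data.children; the split step turns
// that array into individual items. Sketch with a minimal sample payload.
function splitListing(listing) {
  return listing.data.children.map((child) => child.data);
}

const listing = {
  data: {
    children: [
      { data: { id: "p1", title: "First post" } },
      { data: { id: "p2", title: "Second post" } },
    ],
  },
};

const items = splitListing(listing);
```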
Step 4: Combine Streams and Fetch Comments
Merge all post streams, then fetch comments across the combined posts.
- Open Combine Post Streams and confirm it receives inputs from all eight split nodes.
- Configure Fetch Multi-Post Comments to pull comments for each merged post.
- Connect Combine Post Streams → Fetch Multi-Post Comments as shown in the workflow.
Step 5: Filter and Qualify Leads
Filter out moderator replies and narrow results to hiring-related comments.
- Configure Exclude Moderator Replies to identify and remove moderator content.
- Set up Hiring Comment Filter to pass only hiring-related comments forward.
- Ensure the flow is Fetch Multi-Post Comments → Exclude Moderator Replies → Hiring Comment Filter.
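Both filters can be sketched together. Reddit marks moderator comments with distinguished: "moderator", and the hiring check below uses illustrative keywords you would tune to your own niche, not the workflow’s exact filter:

```javascript
// Drop moderator comments, then keep only hiring-related ones.
// The keyword list is illustrative; adjust it to your niche.
const HIRING_RE = /\b(hiring|freelance|gig|looking for)\b/i;

function qualifyLeads(comments) {
  return comments
    .filter((c) => c.distinguished !== "moderator")
    .filter((c) => HIRING_RE.test(c.body));
}

const input = [
  { body: "Reminder: read the rules", distinguished: "moderator" },
  { body: "We're hiring an n8n freelancer", distinguished: null },
  { body: "Nice post!", distinguished: null },
];

const leads = qualifyLeads(input);
```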
Step 6: Connect Google Sheets and Deduplicate
Retrieve existing leads, remove duplicates, and prepare clean records for storage.
- Open Retrieve Existing Leads and select your spreadsheet and worksheet.
- Configure Deduplicate Lead Records to compare incoming leads against existing rows.
- Confirm the flow is Hiring Comment Filter → Retrieve Existing Leads → Deduplicate Lead Records.
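The deduplication logic amounts to comparing incoming leads against a key already stored in the sheet. The key field here (the post/comment URL) is an assumption; match it to whatever unique column your sheet actually uses:

```javascript
// Keep only leads whose key isn't already present in the sheet.
function dedupe(existingRows, incoming, key = "url") {
  const seen = new Set(existingRows.map((row) => row[key]));
  return incoming.filter((lead) => !seen.has(lead[key]));
}

const existing = [{ url: "https://reddit.com/r/forhire/comments/a1/" }];
const incoming = [
  { url: "https://reddit.com/r/forhire/comments/a1/" }, // duplicate
  { url: "https://reddit.com/r/forhire/comments/b2/" }, // new
];

const fresh = dedupe(existing, incoming);
```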
Step 7: Configure the Output Destination
Append the deduplicated leads into your Google Sheet.
- Open Append Leads to Sheet and choose the same spreadsheet destination.
- Map the incoming fields from Deduplicate Lead Records to the target columns.
- Ensure Deduplicate Lead Records outputs directly to Append Leads to Sheet.
Step 8: Test and Activate Your Workflow
Run an end-to-end test to confirm lead capture, filtering, and storage.
- Click Execute Workflow and inspect outputs from Combine Post Streams, Fetch Multi-Post Comments, and Append Leads to Sheet.
- Confirm that Exclude Moderator Replies and Hiring Comment Filter only pass valid hiring leads.
- Verify that new leads are appended in Append Leads to Sheet and that duplicates are removed.
- Switch the workflow to Active to run on schedule.
Common Gotchas
- Reddit OAuth credentials can expire or lack the right scopes. If requests start failing, check your Reddit app settings and the n8n Credentials screen first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Google Sheets appends can go sideways when column names don’t match exactly. Make sure your header row matches the workflow’s fields before you run it on a schedule.
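One way to catch the column-name mismatch early is a quick header check before activating the schedule. The expected column names below are examples, so swap in your own:

```javascript
// Fail fast if the sheet's header row doesn't include every field the
// workflow appends. The expected names are examples only.
const EXPECTED = ["subreddit", "title", "url", "comment", "username", "timestamp"];

function missingColumns(headerRow) {
  return EXPECTED.filter((col) => !headerRow.includes(col));
}

// Sample header missing the "timestamp" column.
const header = ["subreddit", "title", "url", "comment", "username"];
const missing = missingColumns(header);
```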
Frequently Asked Questions
How long does setup take?
About 30 minutes if your Reddit and Google accounts are ready.
Do I need to know how to code?
No. You’ll connect accounts and confirm a few fields in Google Sheets.
Is it free to run?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. Reddit and Google Sheets can be used on free tiers for most small teams.
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I change which subreddits and keywords it tracks?
Yes, and you’ll want to. Update the subreddit targets in the HTTP Request nodes that fetch posts, then adjust the IF filters (and the “Hiring Comment Filter”) to match phrases you actually care about. Common tweaks include focusing on “hiring” flairs, excluding certain authors, and logging extra fields like score or comment count.
What if the Reddit requests start failing?
Most of the time it’s expired OAuth access or a Reddit app misconfiguration. Reconnect the Reddit credential in n8n, then confirm your Reddit app is still active and allowed to access the endpoints you’re calling. Also check rate limiting if you expanded to lots of subreddits, because multiple HTTP requests can spike usage. Frankly, one small permissions change on the Reddit side can break a previously “working” flow.
How many leads can it handle?
Plenty for most prospecting routines: it pulls the latest 10 posts per request stream on each run, every 2 hours, and logs only what passes your filters. On n8n Cloud Starter you’re capped by monthly executions, while self-hosting is mostly limited by your server and how aggressively you hit the Reddit API.
Is n8n the right tool for this, or would Zapier or Make work?
It depends on how picky you are. This flow uses multi-step filtering, comment processing, and deduplication, which is where n8n tends to feel simpler (and cheaper) than building a long Zapier chain. Zapier or Make can be fine for a basic “new post to sheet,” but grabbing top-level comments and excluding mod replies usually turns into extra steps fast. n8n also gives you the self-hosted option, so you’re not watching task limits every week. If you want help choosing, Talk to an automation expert.
Once this is running, your spreadsheet becomes the habit, not Reddit tabs. The workflow handles the repetitive capture and cleanup so you can focus on outreach that actually closes.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.