January 22, 2026

Reddit + Supabase: searchable summaries from saves

Lisa Granqvist, Workflow Automation Expert

Saving great Reddit threads is easy. Finding them again, understanding why you saved them, and pulling the useful bits out later is the part that breaks.

Market researchers and content curators feel this daily. Community managers run into it mid-sprint when feedback is scattered. And founders trying to make faster product calls usually end up re-reading the same long threads. This Reddit Supabase summaries automation turns saved posts into a searchable knowledge base you can actually use.

You’ll see what the workflow stores, how the AI filtering works, and how to keep duplicates and noise out of your database.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Reddit + Supabase: searchable summaries from saves

The Problem: Saved Reddit becomes a messy “later” pile

Reddit is a goldmine until it turns into a junk drawer. You save a post because it has a killer framework, a customer quote, or a surprisingly honest comment chain. Then two weeks later you can’t remember what you saved, what it was about, or which parts mattered. So you open the thread again, skim the post, scroll through comments, get distracted, and repeat the cycle. Multiply that by a few saves per day and you’ve got hours lost to re-reading instead of acting on insights.

It’s not one big mistake. It’s dozens of tiny ones that compound.

  • You re-skim long threads just to extract one or two actionable takeaways.
  • Useful posts disappear in a saved list with no tags, no summaries, and no way to search by theme.
  • Manual copy-paste into docs or spreadsheets leads to missing context and inconsistent formatting.
  • Duplicates sneak in because you forgot you already captured that discussion last month.

The Solution: Save posts on Reddit, get searchable Supabase summaries

This workflow takes the “I’ll read it later” habit you already have and turns it into a structured research system. Once per day (or on-demand), n8n pulls your saved Reddit posts through the Reddit API, filters them by subreddits you care about, and removes anything you’ve already stored in Supabase. Next comes the quality control: an LLM check (Gemini in this build) reads the content against your plain-English criteria, so you’re not stuck with off-topic posts that only share a keyword. For posts that pass, the workflow fetches the comment section, extracts the most relevant parts, and generates a combined summary. Finally, it inserts a clean record into Supabase with the Reddit ID, title, URL, tags, post date, upvotes, comment count, and a single summary you can search later.

The workflow starts with your daily schedule trigger, grabs what’s new, then batches through posts with a small wait for rate control. After the AI confirms relevance, the summarizer compiles the post plus top comments, formats the fields, and stores everything in Supabase as one consistent entry.

What You Get: Automation vs. Results

Example: What This Looks Like

Say you save 10 Reddit posts in a week for product research. Manually, you might spend about 10 minutes per post to re-open it, skim, pull quotes from comments, and paste into a doc, which adds up to roughly an hour and 40 minutes. Then you still have to make it searchable. With this workflow, you save posts as usual, the daily run processes them in batches, and you spend maybe 10 minutes total reviewing the final Supabase entries. That's over an hour back each week, and your notes are structured from day one.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Reddit for accessing your saved posts via API.
  • Supabase to store a searchable research table.
  • Google Gemini API key (get it from Google AI Studio).

Skill level: Intermediate. You’ll connect accounts, paste API keys, and adjust a few filters and prompts safely.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

A daily run pulls new saved posts. The schedule trigger kicks off once per day (there’s also a manual trigger for testing), then the workflow starts by reading what’s already in Supabase so it knows what you’ve captured before.

Duplicates and irrelevant posts are removed early. n8n collects Reddit IDs from your Supabase table, retrieves your saved posts from Reddit, then runs code-based filters (subreddits, basic rules) and an “exclude existing posts” check so you don’t store the same thread twice.

AI checks relevance, then summarization begins. Posts move through a batch loop with a short wait for rate control. Gemini evaluates your natural language criteria, an If condition decides what passes, and only then does the workflow fetch comments and extract details for summarization.

Structured entries land in Supabase. The post and comment highlights get combined, summarized, parsed into consistent fields, and formatted for insertion. At the end you have one record per post with title, URL, tags, summary, post date, upvotes, and comment count.
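For reference, a finished row might look like the object below. This is a hypothetical example; the column names are assumptions based on the fields listed above, so check them against your actual reddit_posts table.

```javascript
// Hypothetical example of one finished Supabase row produced by the
// workflow. Column names are assumptions, not the template's exact schema.
const exampleRow = {
  reddit_id: "t3_1abcd2e",   // Reddit's ID for the post
  title: "How we cut churn by listening to support threads",
  url: "https://www.reddit.com/r/SaaS/comments/1abcd2e/",
  tags: ["churn", "customer-feedback", "saas"],
  summary: "OP describes a weekly review of support tickets and top comments...",
  post_date: "2026-01-15T09:30:00Z",
  upvotes: 412,
  num_comments: 87,
};

console.log(Object.keys(exampleRow).length); // 8 searchable fields
```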

You can easily modify the subreddit and criteria filters to match your niche. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Trigger Type

This workflow supports both manual runs and a scheduled daily run to fetch and process Reddit content.

  1. Add and position Manual Run Trigger for on-demand tests.
  2. Add and configure Daily Schedule Trigger for automated runs (leave the default rule or define your interval).
  3. Connect both Manual Run Trigger and Daily Schedule Trigger to Fetch Posts from Supabase.

Step 2: Connect Supabase and Reddit Data Sources

These nodes pull existing entries from Supabase, then retrieve saved Reddit posts for processing.

  1. Open Fetch Posts from Supabase and set Table to reddit_posts, Operation to getAll, and Return All to true.
  2. Credential Required: Connect your supabaseApi credentials in Fetch Posts from Supabase.
  3. Open Retrieve Saved Reddit Posts and set Resource to profile, Details to saved, and Limit to 10.
  4. Credential Required: Connect your redditOAuth2Api credentials in Retrieve Saved Reddit Posts.
  5. Leave Collect Reddit IDs as-is to extract reddit_id values for de-duplication.

⚠️ Common Pitfall: If your Supabase table has no reddit_id column, Collect Reddit IDs will return empty values and duplicates won’t be excluded.
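Inside n8n this happens across the Collect Reddit IDs and Exclude Existing Posts nodes. As a plain JavaScript sketch with hypothetical sample data (your field names may differ), the de-duplication logic amounts to:

```javascript
// Sketch of the "Collect Reddit IDs" + "Exclude Existing Posts" logic.
// storedRows stands in for the Supabase query result; savedPosts for
// the Reddit API response. Both are hypothetical sample data.
const storedRows = [{ reddit_id: "t3_aaa" }, { reddit_id: "t3_bbb" }];
const savedPosts = [
  { id: "t3_aaa", title: "already captured last month" },
  { id: "t3_ccc", title: "brand-new saved post" },
];

// Collect the IDs already in Supabase into a Set for fast lookups.
const knownIds = new Set(storedRows.map((row) => row.reddit_id));

// Keep only posts whose ID is not already stored.
const newPosts = savedPosts.filter((post) => !knownIds.has(post.id));

console.log(newPosts.map((p) => p.id)); // [ 't3_ccc' ]
```

This is also why the missing reddit_id column breaks de-duplication: the Set ends up empty, so nothing gets filtered out.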

Step 3: Filter, De-duplicate, and Batch Posts

These nodes filter by subreddit rules, remove already-stored posts, and batch remaining items to control execution.

  1. In Filter Posts by Subreddit, update acceptedSubReddits or subredditKeywords in the JavaScript to match your target communities.
  2. Keep Exclude Existing Posts as-is to compare new post IDs against Collect Reddit IDs results.
  3. Use Batch Through Posts to process one post per cycle, then loop back after Delay for Rate Control.
  4. Ensure the flow follows: Filter Posts by Subreddit → Exclude Existing Posts → Batch Through Posts.

Tip: If you need higher throughput, reduce the delay in Delay for Rate Control or adjust batch size in Batch Through Posts.
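Outside n8n, the subreddit filter in the Code node boils down to a few lines. The variable names below mirror the ones mentioned in step 1; the sample posts and keyword values are hypothetical:

```javascript
// Sketch of the "Filter Posts by Subreddit" logic: keep a post if its
// subreddit is in an allow-list or matches a keyword. Sample data only.
const acceptedSubReddits = ["saas", "startups"];
const subredditKeywords = ["product"];

const posts = [
  { subreddit: "SaaS", title: "pricing question" },
  { subreddit: "ProductManagement", title: "roadmap tips" },
  { subreddit: "funny", title: "meme" },
];

const keep = posts.filter((post) => {
  const name = post.subreddit.toLowerCase(); // compare case-insensitively
  return (
    acceptedSubReddits.includes(name) ||
    subredditKeywords.some((kw) => name.includes(kw))
  );
});

console.log(keep.map((p) => p.subreddit)); // [ 'SaaS', 'ProductManagement' ]
```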

Step 4: Configure LLM Gating and Comment Aggregation

This stage uses Gemini to decide whether a post is relevant, then retrieves and aggregates comments for summarization.

  1. Open Primary LLM Check and confirm the Text prompt is set to =Does this reddit post {YOUR CONDITION}? Answer only with 'YES' or 'NO' ... {{ $input.all()[0].json.title }} ... {{ $input.all()[0].json.description }}.
  2. In Evaluate Condition, verify the condition compares {{ $input.all()[0].json.text }} to YES.
  3. Configure Fetch Post Comments with Post ID set to {{ $('Batch Through Posts').first().json.id }} and Subreddit set to {{ $('Batch Through Posts').first().json.subreddit }}.
  4. Credential Required: Connect your redditOAuth2Api credentials in Fetch Post Comments.
  5. Leave the code nodes Extract Comment Details and Combine Post and Comments unchanged to build the aggregated text block.

If Primary LLM Check outputs anything other than YES, Evaluate Condition will send the workflow back to Batch Through Posts without fetching comments.
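Models sometimes answer with extra punctuation or casing ("YES.", "yes"), which a strict equals-YES comparison rejects. If the gate fails unexpectedly, a more tolerant check along these lines (a hypothetical helper, not part of the template) is a common fix:

```javascript
// Hypothetical normalization for the Evaluate Condition step: treat the
// gate as passed only when the model's reply clearly starts with YES.
function passesGate(llmReply) {
  return llmReply.trim().toUpperCase().startsWith("YES");
}

console.log(passesGate("YES"));             // true
console.log(passesGate(" yes. "));          // true
console.log(passesGate("NO, off-topic."));  // false
```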

Step 5: Set Up AI Summarization and Parsing

Gemini summarizes the aggregated content, while a structured output parser enforces a summary/tags schema.

  1. Open Content Summarizer LLM and confirm the Text prompt is set to =summarize the following reddit post... {{ $json.aggregated_text }}.
  2. Verify Structured Result Parser has the JSON example schema for summary and tags.
  3. Ensure Gemini Chat Engine is connected as the language model to Content Summarizer LLM.
  4. Ensure Gemini Prompt Model is connected as the language model to Primary LLM Check.
  5. Credential Required: Connect your googlePalmApi credentials in Gemini Chat Engine and Gemini Prompt Model.

AI sub-nodes like Structured Result Parser inherit credentials from their parent LLM node—add credentials to Gemini Chat Engine, not to the parser.
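As a rough illustration of what the parser's JSON example might contain (the summary/tags field names come from the step above; the contents are hypothetical), the parser infers the schema from a shape like this:

```javascript
// Hypothetical JSON example for the Structured Result Parser: a string
// summary plus a string array of tags. The parser uses this shape to
// force the model's output into parseable JSON.
const exampleSchema = JSON.stringify(
  {
    summary: "One-paragraph summary of the post and its top comments.",
    tags: ["tag-one", "tag-two", "tag-three"],
  },
  null,
  2
);

// Downstream nodes can then rely on reading the output like this:
const parsed = JSON.parse(exampleSchema);
console.log(parsed.tags.length); // 3
```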

Step 6: Configure Supabase Output and Rate Control

This stage formats the summarized data and inserts it into Supabase, then waits before processing the next batch.

  1. In Format Data for Supabase, keep the mapping logic that sets reddit_id, title, url, summary, tags, post_date, upvotes, and num_comments.
  2. Open Insert Reddit Entry and set Table to reddit_posts and Data to Send to autoMapInputData.
  3. Credential Required: Connect your supabaseApi credentials in Insert Reddit Entry.
  4. Ensure Insert Reddit Entry outputs to Delay for Rate Control to avoid API rate limits.
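As a plain JavaScript sketch, the Format Data for Supabase mapping combines the original post with the parser output. The field and column names below are assumptions modeled on Reddit's API and the fields listed earlier; match them to your actual table:

```javascript
// Sketch of the "Format Data for Supabase" mapping. post and aiResult
// stand in for the n8n items; column names are assumptions that must
// match your reddit_posts table.
const post = {
  id: "t3_1abcd2e",
  title: "Example thread",
  permalink: "/r/SaaS/comments/1abcd2e/example_thread/",
  created_utc: 1768816800, // Reddit timestamps are Unix seconds
  ups: 412,
  num_comments: 87,
};
const aiResult = { summary: "Short summary here.", tags: ["saas", "churn"] };

const row = {
  reddit_id: post.id,
  title: post.title,
  url: `https://www.reddit.com${post.permalink}`,
  summary: aiResult.summary,
  tags: aiResult.tags,
  post_date: new Date(post.created_utc * 1000).toISOString(), // seconds -> ms
  upvotes: post.ups,
  num_comments: post.num_comments,
};

console.log(row.url); // https://www.reddit.com/r/SaaS/comments/1abcd2e/example_thread/
```

With Data to Send set to autoMapInputData, each key in this object must match a column name exactly, which is why the mapping node keeps naming consistent.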

Step 7: Test and Activate Your Workflow

Run a manual test to validate the end-to-end pipeline and then activate the schedule for production use.

  1. Click Execute Workflow using Manual Run Trigger.
  2. Confirm that Retrieve Saved Reddit Posts returns items, Filter Posts by Subreddit keeps valid posts, and Exclude Existing Posts removes duplicates.
  3. Verify the AI gate: Primary LLM Check should output YES to proceed to Fetch Post Comments.
  4. Check that Insert Reddit Entry writes rows to Supabase with summary and tags.
  5. Activate the workflow to enable Daily Schedule Trigger for ongoing runs.

Common Gotchas

  • Reddit credentials can expire or need specific permissions. If things break, check your Reddit app settings (client ID/secret) and the n8n credential status first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Add your brand voice and strict relevance criteria early or you’ll be editing outputs forever.

Frequently Asked Questions

How long does it take to set up this Reddit Supabase summaries automation?

About 45 minutes if you already have Reddit, Supabase, and Gemini keys ready.

Do I need coding skills to automate Reddit Supabase summaries?

No. You’ll mostly paste credentials and tweak filters and prompts.

Is n8n free to use for this Reddit Supabase summaries workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Gemini API usage costs, which depend on how long the posts and comments are.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this Reddit Supabase summaries workflow for different subreddits and criteria?

Yes, and you should. Update the subreddit filter logic in the “Filter Posts by Subreddit” step, then edit the natural-language criteria used by the “Primary LLM Check” so Gemini knows what “relevant” means for your niche. Common customizations include “only posts about a specific product category,” “exclude hiring/for-sale posts,” and “require a minimum upvote or comment count.” If you want different output formats, adjust the structured parser and the “Format Data for Supabase” mapping.

Why is my Reddit connection failing in this workflow?

Usually it’s expired or incorrect Reddit API credentials. Regenerate your client secret in Reddit’s developer app, then update the credential in n8n and re-run the manual trigger. If it still fails, check that your Reddit app type and permissions match what the workflow expects, and slow the batching down because rate limiting can look like random failures.

How many posts can this Reddit Supabase summaries automation handle?

Plenty for typical research use: dozens of saved posts per day is fine if you keep batching and rate control on. On n8n Cloud, your monthly execution limit depends on your plan. If you self-host, there’s no hard execution cap, but your server and API rate limits still matter.

Is this Reddit Supabase summaries automation better than using Zapier or Make?

Often, yes. This workflow relies on batching, deduping, and AI decisioning before you spend tokens summarizing, and n8n handles that kind of logic more comfortably. You also get a self-host option, which is a big deal if you want high volume without paying per-task pricing. Zapier or Make can still work if your version is simpler, like "save post → summarize → store," and you don't care about comment extraction or strict filtering. If you're unsure, talk to an automation expert and describe your volume and your criteria.

This is the kind of workflow you set up once and benefit from every day after. Your saved posts stop being a graveyard and start acting like a real research database.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.

