Gmail to Supabase, organized Reddit insights on demand
Reddit is full of the exact words your market uses. But if those insights arrive as a pile of Gmail alerts, they’re basically unsearchable the moment you close the tab.
This Gmail Supabase insights automation is built for growth marketers first, but product managers and founders feel the pain too: the best feature ideas and objection clues keep slipping through the cracks.
You’ll turn F5Bot emails into tagged, filterable entries in Supabase, then pull an AI summary of any thread on demand. The result is a living “voice of customer” database you can actually use.
How This Automation Works
Here’s the complete workflow you’ll be setting up:
n8n Workflow Template: Gmail to Supabase, organized Reddit insights on demand
```mermaid
flowchart LR
  subgraph sg0["Message a model Flow"]
    direction LR
    n4["Get many comments in a post"]
    n5["Get a post"]
    n6["Code in JavaScript2"]
    n7["Webhook"]
    n8["Message a model"]
    n9["Respond to Webhook"]
    n10["Update a row"]
    n11["Get a row"]
    n7 --> n11
    n11 --> n5
    n5 --> n4
    n8 --> n9
    n8 --> n10
    n6 --> n8
    n4 --> n6
  end
  subgraph sg1["Schedule Flow"]
    direction LR
    n0["Gmail"]
    n1["Schedule Trigger"]
    n2["OpenAI"]
    n3["Loop Over Items"]
    n12["Create a row"]
    n0 --> n3
    n2 --> n12
    n12 --> n3
    n3 --> n2
    n1 --> n0
  end
  %% Styling
  classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
  classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
  classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
  classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
  class n1 trigger
  class n8,n2 ai
  class n7,n9 api
  class n6 code
```
Why This Matters: Reddit insights die in your inbox
F5Bot is great at one thing: telling you a keyword showed up on Reddit. The problem starts after that. You open an alert, skim it, think “we should use this,” and then it disappears into the Gmail void with 40 other messages. A week later you’re writing copy, planning a roadmap, or prepping sales talk tracks, and you can’t find the thread where someone explained the exact reason they won’t switch. It’s not just time. It’s lost context, missed patterns, and teams arguing from opinions instead of receipts.
The friction compounds. Here’s where it breaks down in real life.
- You end up re-reading the same types of threads because nothing is categorized or searchable.
- Great quotes never make it into docs, decks, or campaigns, so you keep rewriting messaging from memory.
- Competitor comparisons get missed because you only saw the post, not the comment chain where people get specific.
- By the time you share the “insight,” it’s a vague summary, not a linkable, verifiable source your team trusts.
What You’ll Build: Gmail-to-Supabase Reddit intelligence system
This workflow runs on a simple idea: every Reddit mention should become structured data, not another email you forget. On an hourly schedule, n8n checks your Gmail inbox for new F5Bot alerts and extracts the mentions inside. Each mention is then processed by an AI classifier (OpenAI Chat Model via n8n’s AI nodes) that tags topic and sentiment in plain terms your team can filter later. Those tagged mentions are stored in Supabase so you can search, sort, and build dashboards around them. When you need deeper context, a webhook lets you request an “AI Summary” for a specific record, and the workflow fetches the Reddit post plus its full comment chain, builds a readable thread view, and returns an actionable summary you can paste into Slack, a doc, or a product ticket.
The workflow starts with Gmail ingestion and batching so it can process multiple alerts cleanly. Then it shifts into classification and storage in Supabase. Finally, it supports on-demand deep dives that pull full Reddit context and generate a usable summary in seconds.
What You’re Building
| What Gets Automated | What You’ll Achieve |
|---|---|
| Hourly Gmail scan of F5Bot alert emails | Reddit mentions captured without manual inbox triage |
| AI tagging of topic and sentiment per mention | A filterable, searchable "voice of customer" database in Supabase |
| On-demand Reddit thread summaries via webhook | Actionable context you can drop into Slack, docs, or tickets |
Expected Results
Let’s say you track 10 keywords and F5Bot sends about 20 alerts per day. Manually, even a quick “open, skim, copy key lines, paste into a doc, add a label” routine takes maybe 5 minutes each, which is roughly 100 minutes a day. With this workflow, the hourly Gmail scan and classification run without you, and you only spend time when you request a deep summary (often a minute to click, then a short wait for the AI reply). You get most of that time back, and the insights stay searchable later.
Before You Start
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Gmail to read F5Bot alert emails
- Supabase to store and query mentions
- OpenAI API key (get it from the OpenAI dashboard)
Skill level: Beginner. You’ll connect accounts, paste API keys, and test a few sample emails and thread summaries.
Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).
Step by Step
Hourly inbox check in Gmail. A schedule trigger runs every hour, looks for new F5Bot alert emails, and extracts the mention payload from each message.
Batch processing for stability. Mentions are looped in batches so n8n can handle spikes (a busy subreddit day) without timing out or dropping items.
AI classification that makes the data usable. The OpenAI model analyzes each mention and assigns tags like topic and sentiment, which makes filtering in Supabase or a dashboard straightforward.
Store mentions, then summarize threads on demand. New records are inserted into Supabase immediately. When you request an “AI Summary” (through a webhook), the workflow fetches the Reddit post and comments, builds readable thread context, generates a summary, and writes it back to Supabase plus returns it to the requester.
You can easily modify the topics you tag for (for example “pricing,” “integrations,” “performance”) to match your product. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Schedule Trigger
This workflow starts every hour, pulling new Reddit mention emails and processing them automatically.
- Add and open Hourly Schedule Starter.
- Set the schedule rule to run every hour by keeping Interval set to `hours`.
- Confirm Hourly Schedule Starter is connected to Email Inbox Reader.
Step 2: Connect Gmail for Mention Intake
Emails provide the mention content used for classification and Supabase storage.
- Open Email Inbox Reader and set Operation to `getAll` with Return All enabled.
- In Filters, set Sender to `[YOUR_EMAIL]`.
- Set Received After to `{{ $now.minus(1,"hour") }}`.
- Credential Required: Connect your gmailOAuth2 credentials.

Note: If `[YOUR_EMAIL]` is not replaced with a real sender address, Email Inbox Reader will return no results.

Step 3: Set Up Batching and AI Mention Classification
The workflow batches email items and uses AI to classify mentions by category and sentiment.
- Open Batch Item Iterator and keep default settings to iterate through items from Email Inbox Reader.
- Open AI Mention Classifier and set Model to `gpt-5`.
- In the user message content, keep the expressions that extract email content: `{{ $json.html.split('\n')[3].removeTags().split(': ')[1].toString() }}` and `{{ $json.html.split('\n')[4].removeTags().split(': ')[0].toString().trim() }}`.
- Enable JSON Output by keeping it set to `true` in AI Mention Classifier.
- Credential Required: Connect your openAiApi credentials.
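Those two expressions lean on n8n's `removeTags()` string helper and assume F5Bot puts the post title on the fourth HTML line and the snippet on the fifth. A plain-JavaScript approximation helps when debugging them; the sample email HTML below is invented for illustration:

```javascript
// Rough stand-in for the Step 3 n8n expressions.
// removeTags() is an n8n helper; here we strip tags with a regex instead.
const stripTags = (s) => s.replace(/<[^>]*>/g, "");

// Hypothetical F5Bot alert HTML, simplified for illustration.
const html = [
  "<html>",
  "<body>",
  "<p>F5Bot found your keyword!</p>",
  "<p>Reddit Post (r/saas): Why we switched CRMs</p>",
  "<p>Long story short, pricing pushed us away: </p>",
].join("\n");

const lines = html.split("\n");

// Line index 3 → text after ": " is the post title
const title = stripTags(lines[3]).split(": ")[1].toString();

// Line index 4 → text before the trailing ": " is the body preview
const body = stripTags(lines[4]).split(": ")[0].toString().trim();

console.log(title); // "Why we switched CRMs"
console.log(body);  // "Long story short, pricing pushed us away"
```

If the expressions return `undefined` in your instance, print `$json.html.split('\n')` first and check which line indices your alerts actually use.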
Step 4: Configure Supabase Insert for Mentions
Each classified mention is inserted into Supabase for tracking and later enrichment.
- Open Insert Supabase Record and set Table to `all_mentions`.
- Map fields using the existing expressions, including title `{{ $('Email Inbox Reader').first().json.html.split('\n')[3].removeTags().split(': ')[1].toString() }}`, body `{{ $('Email Inbox Reader').first().json.html.split('\n')[4].removeTags().split(': ')[0].toString().trim() }}`, and account `{{ $('Email Inbox Reader').first().json.html.split('\n')[3].removeTags().split(':')[0].split(' ')[4].slice(1,-1).toString() }}`.
- Keep the link field’s URL-extraction expression as provided, and set posted_at to `{{ $('Email Inbox Reader').first().json.headers.date.toDateTime() }}`.
- Map AI outputs to category, sentiment, and classification_reasoning using `{{ $('AI Mention Classifier').first().json.message.content.Category }}`, `{{ $('AI Mention Classifier').first().json.message.content.Sentiment }}`, and `{{ $('AI Mention Classifier').first().json.message.content.Reasoning }}`.
- Credential Required: Connect your supabaseApi credentials.
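For reference, the field mapping above can be sketched in plain JavaScript. The column names match Step 4; the `toMentionRow` helper and the sample values are hypothetical, and the commented supabase-js call shows where the insert would happen outside n8n:

```javascript
// Sketch only: shaping a classified mention into an all_mentions row.
// Column names match Step 4; sample values are placeholders.
function toMentionRow(mention, aiResult) {
  return {
    title: mention.title,
    body: mention.body,
    account: mention.account,                 // e.g. "r/saas"
    link: mention.link,
    posted_at: mention.postedAt,              // ISO timestamp from the email header
    category: aiResult.Category,              // AI Mention Classifier output
    sentiment: aiResult.Sentiment,
    classification_reasoning: aiResult.Reasoning,
  };
}

// With supabase-js, the insert itself would then be roughly:
//   const { error } = await supabase.from("all_mentions").insert(row);

const row = toMentionRow(
  {
    title: "Why we switched CRMs",
    body: "Long story short, pricing pushed us away",
    account: "r/saas",
    link: "https://reddit.com/r/saas/comments/abc123/x",
    postedAt: "2024-05-01T12:00:00Z",
  },
  { Category: "pricing", Sentiment: "negative", Reasoning: "Post complains about cost." }
);
console.log(row.category, row.sentiment);
```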
Step 5: Configure Webhook, Reddit Retrieval, and Thread Insight Generation
This path enriches a stored mention by fetching Reddit post details and comments, then generating an insight summary.
- Open Incoming Webhook Listener and confirm Path is `8bc532ac-315a-4987-9f21-3d48c50f2b99`, HTTP Method is `POST`, and Response Mode is `responseNode`.
- In Fetch Supabase Entry, set Table to `all_mentions` and filter id with `{{ $json.body.id }}`. Credential Required: Connect your supabaseApi credentials.
- Configure Retrieve Post Details with Operation `get`, Post ID `{{ $json.link.split("/comments/")[1].split("/")[0] }}`, and Subreddit `{{ $json.account.split("/r/")[1].split("/")[0] }}`. Credential Required: Connect your redditOAuth2Api credentials.
- Configure Fetch Post Comments with Resource `postComment`, Operation `getAll`, Return All enabled, Post ID `{{ $json.id }}`, and Subreddit `{{ $json.subreddit }}`. Credential Required: Connect your redditOAuth2Api credentials.
- Keep the JavaScript in Build Comment Markdown that generates a single `markdown` string from nested comments.
- In Thread Insight Generator, set Model to `gpt-5` and keep the prompt input: `{{ "Main Post: " + $("Retrieve Post Details").first().json.selftext + "\nComments in markdown format: " + $json.markdown }}`. Credential Required: Connect your openAiApi credentials.
- Thread Insight Generator outputs to both Return Webhook Reply and Modify Supabase Entry in parallel.
- In Return Webhook Reply, set Respond With to `text` and Response Body to `{{ $('Thread Insight Generator').first().json.output[0].content[0].text }}`.
- In Modify Supabase Entry, set Operation to `update`, filter id with `{{ $("Fetch Supabase Entry").first().json.id }}` and link with `=*{{ $('Retrieve Post Details').first().json.id }}*`. Map thread_summary to `{{ $json.output[0].content[0].text }}` and raw_full_thread to `{{ "Main Post: " + $("Retrieve Post Details").first().json.selftext + "\nComments in markdown format: " + $json.markdown }}`. Credential Required: Connect your supabaseApi credentials.
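The workflow ships its own code in the Build Comment Markdown node; the recursion below is only a sketch of the same idea, and the `{ author, body, replies }` shape is an assumed simplification of what the Reddit node actually returns:

```javascript
// Sketch of a Build Comment Markdown-style node: flatten a nested
// Reddit comment tree into one indented markdown string.
// The { author, body, replies } shape is an assumption for illustration.
function commentsToMarkdown(comments, depth = 0) {
  return comments
    .map((c) => {
      const indent = "  ".repeat(depth);
      const line = `${indent}- **${c.author}**: ${c.body}`;
      // Recurse into replies, indenting one level deeper each time
      const children = c.replies?.length
        ? "\n" + commentsToMarkdown(c.replies, depth + 1)
        : "";
      return line + children;
    })
    .join("\n");
}

const markdown = commentsToMarkdown([
  {
    author: "user_a",
    body: "We dropped it over pricing.",
    replies: [{ author: "user_b", body: "Same here.", replies: [] }],
  },
  { author: "user_c", body: "Support was the dealbreaker for us.", replies: [] },
]);
console.log(markdown);
```

Keeping replies indented under their parents is what lets the Thread Insight Generator see who is answering whom, which matters for summarizing objections accurately.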
Step 6: Test and Activate Your Workflow
Validate both the hourly email ingestion and the webhook-driven thread analysis.
- Click Execute Workflow to run Hourly Schedule Starter manually and confirm Email Inbox Reader returns recent emails.
- Verify a new row is created in `all_mentions` after AI Mention Classifier and Insert Supabase Record complete.
- Send a POST request to Incoming Webhook Listener with `{ "id": "YOUR_RECORD_ID" }` to validate the enrichment path.
- Confirm Return Webhook Reply returns the AI summary and Modify Supabase Entry updates the thread_summary and raw_full_thread fields.
- Toggle the workflow to Active for production use.
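The webhook test above can be scripted from Node 18+ with the built-in `fetch`; the host is a placeholder and `YOUR_RECORD_ID` stays whatever id exists in your `all_mentions` table:

```javascript
// Sketch: trigger the on-demand summary path. The host is a placeholder;
// the path segment is the webhook path from Step 5.
const url =
  "https://YOUR_N8N_HOST/webhook/8bc532ac-315a-4987-9f21-3d48c50f2b99";
const payload = JSON.stringify({ id: "YOUR_RECORD_ID" });

async function requestSummary() {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: payload,
  });
  return res.text(); // Respond With is "text", so read the reply as plain text
}

// Uncomment once your workflow is active:
// requestSummary().then(console.log);
console.log(payload);
```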
Troubleshooting Tips
- Gmail credentials can expire or need specific permissions. If things break, check the connected Google account inside n8n’s Credentials menu first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- OpenAI prompts that ship “generic” will produce generic tags. Bake your categories and a few examples into the AI nodes early, or you will be cleaning labels in Supabase later.
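One way to bake categories in is to make them explicit in the instructions you give AI Mention Classifier. The helper and category list below are examples, not the workflow's actual prompt; align the JSON keys with the Category/Sentiment/Reasoning fields mapped in Step 4:

```javascript
// Example classifier prompt with explicit categories baked in.
// CATEGORIES and the output shape are illustrative; adapt to your product.
const CATEGORIES = ["pricing", "integrations", "performance", "support", "other"];

function buildClassifierPrompt(title, body) {
  return [
    "You are classifying a Reddit mention of our product.",
    `Pick exactly one category from: ${CATEGORIES.join(", ")}.`,
    "Pick a sentiment: positive, neutral, or negative.",
    'Reply with JSON only: {"Category": "...", "Sentiment": "...", "Reasoning": "..."}',
    "",
    `Title: ${title}`,
    `Body: ${body}`,
  ].join("\n");
}

const prompt = buildClassifierPrompt(
  "Why we switched CRMs",
  "Long story short, pricing pushed us away"
);
console.log(prompt);
```

Adding two or three worked examples (mention in, JSON out) to the same prompt tightens the labels further and saves cleanup in Supabase later.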
Quick Answers
**How long does setup take?**
About 45 minutes if your Gmail, Supabase, and OpenAI accounts are ready.

**Do I need to know how to code?**
No coding required. You’ll mainly connect accounts, then test with a real F5Bot email.

**Is it free to run?**
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API costs (often just a few cents per batch of mentions, depending on volume).

**Where should I host n8n?**
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

**Can I customize the workflow?**
Yes, and you should. The fastest win is editing the AI Mention Classifier instructions so tags match your product language (features, integrations, pricing, bugs). You can also change the Gmail search query to watch multiple inboxes or labels, and adjust the Supabase tables/fields if your dashboard needs extra columns like “priority” or “owner.” If you don’t use Reddit summaries, you can disable the on-demand webhook path and keep only the “capture + classify” part.

**Why is my Gmail connection failing?**
Usually it’s expired Google OAuth access or the wrong Gmail permissions. Reconnect the Gmail credential in n8n, then confirm the workflow can read the mailbox/label your F5Bot alerts land in. If you recently changed your Google Workspace security settings, that can also silently block the connection until you re-authorize.

**How many mentions can it handle?**
Plenty for most small teams: dozens to a few hundred mentions a day is typical, and batching keeps it stable.

**Is n8n better than Zapier or Make for this?**
Often, yes. The “capture emails, loop through mentions, classify with AI, store in Supabase, then support an on-demand webhook summary” flow is the kind of multi-path automation that gets awkward (and pricey) in Zapier. n8n also makes it easier to mix scheduled runs with webhook-triggered deep dives in the same workflow. Make can handle pieces of this, but the Reddit thread pull plus comment formatting tends to require more custom work. If you just need a simple “Gmail email → spreadsheet row,” Zapier is fine. Talk to an automation expert if you want a quick recommendation based on your volumes and team habits.
Once your Reddit signals live in Supabase instead of Gmail, they start compounding. Set it up once, and your next launch, landing page, and roadmap review gets a lot easier.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.