YouTube to Google Sheets, enriched leads from comments
YouTube comments are full of buying signals, partnership hints, and “tell me more” energy. Then you try to capture it: you copy usernames, lose the thread, and end up with a messy doc nobody trusts.
This is where marketing managers get stuck, and it’s brutal for a solo founder too. Even a content strategist running competitor research feels it. With YouTube lead enrichment automation, you turn commenters into researched prospects in a sheet your team can actually use.
Below is the workflow, the business outcome, and the practical setup details so you can stop “profile hunting” and start following up with context.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: YouTube to Google Sheets, enriched leads from comments
```mermaid
flowchart LR
subgraph sg0["When chat message received Flow"]
direction LR
n5@{ icon: "mdi:play-circle", form: "rounded", label: "When chat message received", pos: "b", h: 48 }
n6@{ icon: "mdi:robot", form: "rounded", label: "AI Agent", pos: "b", h: 48 }
n7@{ icon: "mdi:brain", form: "rounded", label: "OpenRouter Chat Model", pos: "b", h: 48 }
n8@{ icon: "mdi:database", form: "rounded", label: "get comments", pos: "b", h: 48 }
n9@{ icon: "mdi:memory", form: "rounded", label: "Simple Memory", pos: "b", h: 48 }
n10@{ icon: "mdi:web", form: "rounded", label: "search google", pos: "b", h: 48 }
n11@{ icon: "mdi:web", form: "rounded", label: "get url markdown", pos: "b", h: 48 }
n12@{ icon: "mdi:database", form: "rounded", label: "create a row for new search ..", pos: "b", h: 48 }
n13@{ icon: "mdi:database", form: "rounded", label: "update result for user", pos: "b", h: 48 }
n14@{ icon: "mdi:web", form: "rounded", label: "instagram full profile scraper", pos: "b", h: 48 }
n15@{ icon: "mdi:database", form: "rounded", label: "mark comment as processed", pos: "b", h: 48 }
n16@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger", pos: "b", h: 48 }
n17["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code"]
n17 --> n6
n8 -.-> n6
n9 -.-> n6
n10 -.-> n6
n11 -.-> n6
n16 --> n17
n7 -.-> n6
n13 -.-> n6
n15 -.-> n6
n5 --> n6
n14 -.-> n6
n12 -.-> n6
end
subgraph sg1["When clicking ‘Execute workflow’ Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking ‘Execute workf..", pos: "b", h: 48 }
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>HTTP apify get comments from.."]
n2@{ icon: "mdi:database", form: "rounded", label: "mark video url as scrapped", pos: "b", h: 48 }
n3@{ icon: "mdi:database", form: "rounded", label: "Get video urls", pos: "b", h: 48 }
n4@{ icon: "mdi:database", form: "rounded", label: "Save scrapped comments", pos: "b", h: 48 }
n18@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger1", pos: "b", h: 48 }
n3 --> n1
n3 --> n2
n18 --> n3
n1 --> n4
n0 --> n3
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n5,n16,n0,n18 trigger
class n6 ai
class n7 aiModel
class n9 ai
class n8,n12,n13,n15,n2,n3,n4 database
class n10,n11,n14,n1 api
class n17 code
classDef customIcon fill:none,stroke:none
class n17,n1 customIcon
```
The Problem: Turning YouTube Comments Into Usable Leads
When someone comments on a YouTube video, they’re raising their hand in public. But capturing that intent is weirdly manual. You open the video, scroll, copy usernames, maybe click through to a channel, then try to find anything useful (website, Instagram, company name, email). Half the time, you can’t tell if the commenter is a real prospect, a student, a spammer, or a competitor. And because the process is annoying, most teams “save it for later”… which usually means never.
It adds up fast. Here’s where it breaks down.
- One high-performing video can create hundreds of comments, and manually skimming them turns into a weekly time sink.
- Notes end up scattered across spreadsheets, DMs, and screenshots, so nobody knows what’s already been checked.
- Basic exports give you text, not context, which means outreach feels generic (and gets ignored).
- Without a “processed” flag, the same comments get revisited, or worse, forgotten completely.
The Solution: Scrape Comments, Enrich Authors, Save Clean Leads
This n8n workflow watches a Google Sheet for YouTube video URLs, then pulls in the comments automatically using an Apify actor built for YouTube comment extraction. Those raw comments are stored in a structured way so you can track what came from which video. After that, the workflow grabs unprocessed comment records and hands them to an AI agent powered by an OpenRouter chat model. The agent researches the author using Google Search (via Serper) and targeted web scrapers (Apify website markdown scraper and Instagram profile scraper) to find publicly available details like social links, bios, and sometimes contact info. Finally, it creates or updates a “lead” row in Google Sheets and marks the original comment as processed so it won’t loop forever.
The workflow can run on a schedule (hourly to scrape new videos, and more frequently to enrich new comments) or be kicked off manually when you’re testing. You can also trigger enrichment from the built-in chat trigger (“When chat message received”), which is handy when you want to research a specific commenter right now, not “on the next run.”
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Pulling comments from every video URL you drop in your sheet | One structured “comments” tab instead of copy-pasted usernames |
| AI research on each commenter (Google search, website and Instagram scraping) | Lead rows with social links, bios, and sometimes contact info |
| Flagging videos and comments as processed | A clean queue with no duplicate work or forgotten follow-ups |
Example: What This Looks Like
Say you review 3 YouTube videos per week (yours or competitors’) and each video has about 120 comments worth scanning. Manually, even 2 minutes per commenter to click a profile, search their name, and jot a note adds up to roughly 12 hours of work a week. With this workflow, you paste the 3 URLs into Google Sheets (about 5 minutes), then let enrichment run in the background while it fills your “leads” sheet. You still choose who to contact, but the research and row-building are done.
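If you want to sanity-check that estimate with your own volumes, the arithmetic is trivial; here it is as a one-off script using the example’s assumed figures:

```javascript
// Back-of-envelope estimate of manual research time.
// The figures mirror the example above; swap in your own.
const videosPerWeek = 3;
const commentsPerVideo = 120;
const minutesPerCommenter = 2; // click profile, search name, jot a note

const hoursPerWeek = (videosPerWeek * commentsPerVideo * minutesPerCommenter) / 60;
console.log(`${hoursPerWeek} hours of manual research per week`); // 12
```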
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets to store videos, comments, and leads.
- Apify to scrape YouTube comments and profiles.
- OpenRouter API key (get it from your OpenRouter dashboard).
- Serper API key (get it from your Serper.dev account settings).
Skill level: Intermediate. You’ll connect accounts, paste API keys, and map a few fields into Google Sheets.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
New video URLs appear in your sheet. On the hourly schedule (or manual start), n8n reads your “videos” tab in Google Sheets and looks for URLs that haven’t been processed yet.
Comments get scraped and stored. For each URL, an HTTP request runs the YouTube comments Apify actor, then the workflow appends the raw results into your “comments” sheet and flags the video as “scrapped” so it won’t run again.
Unprocessed commenters get researched. On the minute schedule (and optionally from the incoming chat trigger), the workflow pulls comment records that are still marked unprocessed, prepares the AI input, and sends it to the research agent using an OpenRouter chat model. The agent uses Serper for Google search and Apify scrapers for website markdown and Instagram profiles when those sources help.
Enriched leads land in Google Sheets. n8n creates a new lead row, updates it with what was found (social links, bio, possible contact fields), then marks the original comment record as processed so the queue stays clean.
You can modify the lead fields to match your pipeline (industry, offer fit, priority). See the full implementation guide below for customization options.
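For a concrete picture, here’s a hypothetical customized lead row. Only username, email, and short_Description are fields this guide maps (see Step 5); industry and priority are illustrative extras you’d add yourself:

```javascript
// Illustrative lead-row shape. Only username, email, and short_Description
// are mapped by the workflow as described below; industry and priority are
// hypothetical custom columns you would add to the "leads" sheet and prompt.
const leadRow = {
  username: "@examplehandle",
  email: "", // filled only when publicly available
  short_Description: "Indie SaaS founder asking about pricing automations",
  industry: "SaaS", // custom: offer fit
  priority: "high", // custom: follow-up order
};
```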
Step-by-Step Implementation Guide
Step 1: Configure the Manual, Schedule, and Chat Triggers
Set up all entry points so the workflow can run manually, on schedules, and via chat.
- Open Manual Execution Start and keep it as the manual trigger for on-demand runs.
- Configure Hourly Schedule Trigger with the rule interval set to `hours` so it starts video URL retrieval automatically.
- Configure Minute Schedule Trigger with the rule interval set to `minutes` and `1` to continuously feed AI processing.
- Enable Incoming Chat Trigger so chat requests go directly into Research Automation Agent.
- Confirm the execution flow: Manual Execution Start → Retrieve Video URL List, Hourly Schedule Trigger → Retrieve Video URL List, Minute Schedule Trigger → Prepare AI Input, and Incoming Chat Trigger → Research Automation Agent.
Step 2: Connect Google Sheets for Video and Comment Intake
These nodes read video URLs, mark them as processed, and store scraped comment data.
- Open Retrieve Video URL List and set Document to `[YOUR_ID]` and Sheet to `videos` (gid `0`).
- In Retrieve Video URL List, keep the filter on `scrapped` so only unprocessed URLs are returned.
- Open Flag Video URL Processed and set Operation to `update`, mapping `url` to `{{ $json.url }}` and `scrapped` to `TRUE`, matching on `url`.
- Open Append Scraped Comments and set Operation to `append` with Sheet set to `comments` (gid `6484598`), mapping `author` to `{{ $json.author }}` (an illustrative row shape follows this list).
- Credential Required: Connect your googleSheetsOAuth2Api credentials for Retrieve Video URL List, Flag Video URL Processed, and Append Scraped Comments.
- Confirm parallel execution: Retrieve Video URL List outputs to both Fetch Video Comments API and Flag Video URL Processed in parallel.
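For orientation, here are rough row shapes for the two intake tabs, using only the column names this guide references; your actual sheets may carry extra columns returned by the Apify actor:

```javascript
// Approximate row shapes for the "videos" and "comments" tabs, assuming
// the column names referenced in this guide (url, scrapped, author,
// processed, avatarAccessibilityText).
const videoRow = {
  url: "https://www.youtube.com/watch?v=VIDEO_ID", // pasted by you
  scrapped: "FALSE", // flipped to TRUE by Flag Video URL Processed
};

const commentRow = {
  author: "@examplehandle", // mapped by Append Scraped Comments
  avatarAccessibilityText: "@examplehandle avatar", // used to match rows in Step 5
  processed: "FALSE", // flipped to TRUE by Set Comment Processed
};
```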
Step 3: Configure Comment Scraping via API
This step retrieves YouTube comments for each URL and forwards them for storage.
- Open Fetch Video Comments API and set Method to `POST`.
- Set the URL to `https://api.apify.com/v2/acts/mohamedgb00714~youtube-video-comments/run-sync-get-dataset-items?token=[CONFIGURE_YOUR_TOKEN]`.
- Enable Send Body and set Content Type to `json`.
- Set the JSON body to `{ "videoUrl": "{{ $json.url }}" }`.
- Verify the flow Fetch Video Comments API → Append Scraped Comments is connected (a standalone request sketch follows this list).

Replace `[CONFIGURE_YOUR_TOKEN]` with a valid Apify token in Fetch Video Comments API or requests will fail.
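To sanity-check your Apify token and see the actor’s output shape before wiring up the node, a minimal standalone version of the same request might look like this (run as an ES module on Node 18+, which provides a global fetch):

```javascript
// Minimal sketch of the call Fetch Video Comments API makes.
// Assumes Node 18+ and an APIFY_TOKEN environment variable; run as ESM
// (e.g. test.mjs) so top-level await works.
const videoUrl = "https://www.youtube.com/watch?v=VIDEO_ID"; // any URL from your videos tab

const endpoint =
  "https://api.apify.com/v2/acts/mohamedgb00714~youtube-video-comments" +
  `/run-sync-get-dataset-items?token=${process.env.APIFY_TOKEN}`;

const res = await fetch(endpoint, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ videoUrl }),
});
const comments = await res.json(); // dataset items: one object per comment
console.log(`${comments.length} comments fetched`);
```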
Step 4: Set Up the AI Agent and Memory
These nodes generate the AI instructions, maintain context, and run autonomous enrichment.
- Open Prepare AI Input and keep the provided jsCode that sets chatInput and a randomized sessionId (a sketch of this Code node follows this list).
- Confirm the flow Prepare AI Input → Research Automation Agent is connected.
- Open Research Automation Agent and keep maxIterations set to `100` with the detailed systemMessage instructions.
- Ensure OpenRouter Conversational Model is connected as the language model with Model set to `google/gemini-2.5-flash-preview-05-20`.
- Credential Required: Connect your openRouterApi credentials on the parent Research Automation Agent configuration (the model is attached as a sub-node).
- Keep Session Memory Buffer connected to Research Automation Agent for ongoing conversational context.
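The template ships with this jsCode, so treat the following only as a sketch of its shape: a Code node that builds the agent prompt and a randomized sessionId. The author and text field names are assumptions; match them to your comments sheet.

```javascript
// Hypothetical sketch of the "Prepare AI Input" jsCode ("Run Once for
// All Items" mode). The real node ships with the template; field names
// here (author, text) are assumptions.
return items.map((item) => ({
  json: {
    chatInput:
      `Research the YouTube commenter "${item.json.author}" who wrote: ` +
      `"${item.json.text}". Find social links, bio, and any public contact info.`,
    // Fresh session id per item so Session Memory Buffer doesn't mix contexts.
    sessionId: `session-${Math.random().toString(36).slice(2, 10)}`,
  },
}));
```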
Step 5: Configure AI Tools for Search, Scraping, and Lead Updates
These tools enable the agent to read comments, search the web, scrape profiles, and update lead records.
- Open Retrieve Comment Records, set it to filter on `processed`, and keep Tool Description as `list of comments to process`.
- Open Google Search Tool and verify the URL `https://google.serper.dev/search`, the header `X-API-KEY` set to `[CONFIGURE_YOUR_API_KEY]`, and the body parameters: `q` set to ```{{ /*n8n-auto-generated-fromAI-override*/ $fromAI('parameters0_Value', ``, 'string') }}```, `location` set to `United States`, and `num` set to `100` (a standalone query sketch follows this list).
- Open Fetch Website Markdown and keep the JSON body using `{{ $fromAI('crawelEnabled', 'get other links from this website', 'boolean', false) }}` and `{{ $fromAI('url', 'target url', 'string') }}`.
- Open Instagram Profile Scraper and keep the JSON body with `{{ $fromAI('username', 'instagram profile user name', 'string') }}` for `instagramUsernames`.
- Open Create Lead Row and set Operation to `appendOrUpdate` with `username` mapped to ```{{ /*n8n-auto-generated-fromAI-override*/ $fromAI('username', ``, 'string') }}```.
- Open Update Lead Details and keep Operation as `appendOrUpdate`, mapping fields like `email` to ```{{ /*n8n-auto-generated-fromAI-override*/ $fromAI('email', ``, 'string') }}``` and `short_Description` to ```{{ /*n8n-auto-generated-fromAI-override*/ $fromAI('short_Description', ``, 'string') }}```.
- Open Set Comment Processed, set `processed` to ```{{ $fromAI('processed', ``, 'boolean') }}```, and match on `avatarAccessibilityText` using ```{{ /*n8n-auto-generated-fromAI-override*/ $fromAI('avatarAccessibilityText__using_to_match_', ``, 'string') }}```.
- Credential Required: Connect your googleSheetsOAuth2Api credentials on the parent Research Automation Agent tool configuration for Retrieve Comment Records, Create Lead Row, Update Lead Details, and Set Comment Processed.

Replace the `[CONFIGURE_YOUR_TOKEN]` and `[CONFIGURE_YOUR_API_KEY]` placeholders in Google Search Tool, Fetch Website Markdown, and Instagram Profile Scraper to avoid authentication errors.
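To confirm your Serper key before handing the tool to the agent, here’s a minimal standalone version of the same query (Node 18+ ESM; at runtime the agent fills `q` via $fromAI, so the query string below is just an example):

```javascript
// Minimal sketch of the request Google Search Tool sends to Serper.
// Assumes Node 18+ and a SERPER_API_KEY environment variable.
const res = await fetch("https://google.serper.dev/search", {
  method: "POST",
  headers: {
    "X-API-KEY": process.env.SERPER_API_KEY,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    q: '"@examplehandle" instagram', // at runtime the agent sets q via $fromAI
    location: "United States",
    num: 100,
  }),
});
const data = await res.json();
console.log((data.organic ?? []).slice(0, 3).map((r) => r.link)); // top result links
```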
Step 6: Test and Activate Your Workflow
Run a controlled test to confirm data flows into your sheets and the agent completes a full enrichment cycle.
- Click Execute Workflow and use Manual Execution Start to trigger a test run.
- Verify that Retrieve Video URL List pulls an unprocessed URL and that Fetch Video Comments API returns comment data.
- Confirm Append Scraped Comments adds rows to the `comments` sheet and Flag Video URL Processed marks the URL as `TRUE`.
- Trigger Minute Schedule Trigger (or run Prepare AI Input manually) and confirm Research Automation Agent writes to Create Lead Row, Update Lead Details, and Set Comment Processed.
- When tests succeed, toggle the workflow to Active so schedules and chat triggers run in production.
Common Gotchas
- Google Sheets OAuth credentials can expire or lose permissions if a workspace policy changes. If rows stop writing, check the Google Sheets credential in n8n and confirm the spreadsheet is still shared with the right account.
- If you’re using Wait nodes or external scraping (Apify actors), processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
Frequently Asked Questions
How long does setup take?
About an hour if you already have your API keys.
Do I need coding skills?
No. You’ll mostly connect accounts and map fields into the right Google Sheets columns.
Can I run this for free?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenRouter usage plus Serper and Apify costs (which depend on how many comments you process).
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I target only influencers or creators instead of every commenter?
Yes, and honestly it’s a smart filter. In the “Prepare AI Input” code step, add rules that tell the agent to look for follower counts, creator keywords, or brand partnerships, then only write a lead row when the agent returns a match (add an If node before “Create Lead Row”). Common tweaks include saving a “lead type” field, adding a priority score, and storing the profile link you actually plan to DM.
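A minimal version of that pre-filter, assuming your comment rows carry author and text fields, could sit at the top of the Prepare AI Input code step:

```javascript
// Hypothetical creator/influencer pre-filter for "Prepare AI Input".
// Field names (author, text) are assumptions; adjust to your sheet.
const CREATOR_HINTS = /collab|sponsor|partnership|my channel|i make videos/i;

return items
  .filter((item) => CREATOR_HINTS.test(item.json.text ?? ""))
  .map((item) => ({
    json: {
      chatInput:
        `Research "${item.json.author}". Note follower counts, creator ` +
        `keywords, and brand partnerships before writing a lead row.`,
      sessionId: `session-${Math.random().toString(36).slice(2, 10)}`,
    },
  }));
```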
What should I check if rows stop writing to Google Sheets?
Usually it’s an expired OAuth token or changed spreadsheet permissions. Reconnect the Google Sheets credential in n8n, then confirm the same account can edit the file. If it fails only on certain rows, check for renamed tabs or missing columns in your “videos,” “comments,” or “leads” sheets.
How many comments can I process per month?
On n8n Cloud Starter, you can run up to 2,500 executions per month, so capacity depends on how you batch comments. If you self-host, there’s no execution cap, and you’ll mainly be limited by Apify/Serper rate limits and your server. Practically, most small teams run this comfortably for a few videos per day and scale up once the sheet structure and prompts are dialed in.
Should I build this in n8n, Zapier, or Make?
For this workflow, n8n is usually a better fit because the AI-agent-style enrichment, branching logic, and batching get expensive fast on “task-based” tools. You also get the option to self-host, which changes the math if you’re processing lots of comments. Zapier or Make can still work if you’re only doing simple capture and tagging with no enrichment. Once you add web search, scraping, and a “processed” loop, n8n tends to be less fragile. Talk to an automation expert if you want a quick recommendation based on your volume.
This is the kind of workflow you set up once and then quietly benefit from every week. The sheet stays clean, the leads get richer, and you stop wasting time chasing breadcrumbs.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.