January 22, 2026

YouTube to Google Sheets, enriched leads from comments

Lisa Granqvist, Partner, Workflow Automation Expert

YouTube comments are full of buying signals, partnership hints, and “tell me more” energy. Then you try to capture it: you copy usernames, lose the thread, and end up with a messy doc nobody trusts.

This is where marketing managers get stuck, and it’s brutal for a solo founder too. Even a content strategist running competitor research feels it. With YouTube lead enrichment automation, you turn commenters into researched prospects in a sheet your team can actually use.

Below is the workflow, the business outcome, and the practical setup details so you can stop “profile hunting” and start following up with context.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: YouTube to Google Sheets, enriched leads from comments

The Problem: Turning YouTube Comments Into Usable Leads

When someone comments on a YouTube video, they’re raising their hand in public. But capturing that intent is weirdly manual. You open the video, scroll, copy usernames, maybe click through to a channel, then try to find anything useful (website, Instagram, company name, email). Half the time, you can’t tell if the commenter is a real prospect, a student, a spammer, or a competitor. And because the process is annoying, most teams “save it for later”… which usually means never.

It adds up fast. Here’s where it breaks down.

  • One high-performing video can create hundreds of comments, and manually skimming them turns into a weekly time sink.
  • Notes end up scattered across spreadsheets, DMs, and screenshots, so nobody knows what’s already been checked.
  • Basic exports give you text, not context, which means outreach feels generic (and gets ignored).
  • Without a “processed” flag, the same comments get revisited, or worse, forgotten completely.

The Solution: Scrape Comments, Enrich Authors, Save Clean Leads

This n8n workflow watches a Google Sheet for YouTube video URLs, then pulls in the comments automatically using an Apify actor built for YouTube comment extraction. Those raw comments are stored in a structured way so you can track what came from which video. After that, the workflow grabs unprocessed comment records and hands them to an AI agent powered by an OpenRouter chat model. The agent researches the author using Google Search (via Serper) and targeted web scrapers (Apify website markdown scraper and Instagram profile scraper) to find publicly available details like social links, bios, and sometimes contact info. Finally, it creates or updates a “lead” row in Google Sheets and marks the original comment as processed so it won’t loop forever.

The workflow can run on a schedule (hourly to scrape new videos, and more frequently to enrich new comments) or be kicked off manually when you’re testing. You can also trigger enrichment via Telegram chat, which is handy when you want to research a specific commenter right now, not “on the next run.”

What You Get: Automation vs. Results

Example: What This Looks Like

Say you review 3 YouTube videos per week (yours or competitors) and each video has about 120 comments worth scanning. Manually, even 2 minutes per commenter to click a profile, search their name, and jot a note is roughly 12 hours of work. With this workflow, you paste the 3 URLs into Google Sheets (about 5 minutes), then let enrichment run in the background while it fills your “leads” sheet. You still choose who to contact, but the research and row-building are done.
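The back-of-the-envelope math above works out like this (the numbers are the example’s assumptions, not benchmarks):

```javascript
// Assumptions from the example: 3 videos/week, ~120 comments each,
// ~2 minutes of manual research per commenter.
const videosPerWeek = 3;
const commentsPerVideo = 120;
const minutesPerCommenter = 2;

const manualMinutes = videosPerWeek * commentsPerVideo * minutesPerCommenter; // 720
const manualHours = manualMinutes / 60; // 12

console.log(`Manual research: ~${manualHours} hours/week`);
```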

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets to store videos, comments, and leads
  • Apify to scrape YouTube comments and profiles
  • OpenRouter API key (from your OpenRouter dashboard)
  • Serper API key (from your Serper.dev account settings)

Skill level: Intermediate. You’ll connect accounts, paste API keys, and map a few fields into Google Sheets.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

New video URLs appear in your sheet. On the hourly schedule (or manual start), n8n reads your “videos” tab in Google Sheets and looks for URLs that haven’t been processed yet.

Comments get scraped and stored. For each URL, an HTTP request runs the YouTube comments Apify actor, then the workflow appends the raw results to your “comments” sheet and sets the video’s “scrapped” column to TRUE (the template’s column name, misspelling and all) so it won’t run again.

Unprocessed commenters get researched. On the minute schedule (and optionally from a Telegram chat trigger), the workflow pulls comment records that are still marked unprocessed, prepares the AI input, and sends it to the research agent using an OpenRouter chat model. The agent uses Serper for Google search and Apify scrapers for website markdown and Instagram profiles when those sources help.

Enriched leads land in Google Sheets. n8n creates a new lead row, updates it with what was found (social links, bio, possible contact fields), then marks the original comment record as processed so the queue stays clean.

You can easily modify the lead fields to match your pipeline (industry, offer fit, priority) based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual, Schedule, and Chat Triggers

Set up all entry points so the workflow can run manually, on schedules, and via chat.

  1. Open Manual Execution Start and keep it as the manual trigger for on-demand runs.
  2. Configure Hourly Schedule Trigger with the rule interval set to hours so it starts the video URL retrieval automatically.
  3. Configure Minute Schedule Trigger with the rule interval set to every 1 minute so it continuously feeds AI processing.
  4. Enable Incoming Chat Trigger so chat requests can go directly into Research Automation Agent.
  5. Confirm the execution flow: Manual Execution Start → Retrieve Video URL List, Hourly Schedule Trigger → Retrieve Video URL List, Minute Schedule Trigger → Prepare AI Input, and Incoming Chat Trigger → Research Automation Agent.

Step 2: Connect Google Sheets for Video and Comment Intake

These nodes read video URLs, mark them as processed, and store scraped comment data.

  1. Open Retrieve Video URL List and set Document to [YOUR_ID] and Sheet to videos (gid 0).
  2. In Retrieve Video URL List, keep the filter on scrapped so only unprocessed URLs are returned.
  3. Open Flag Video URL Processed and set Operation to update, mapping url to {{ $json.url }} and scrapped to TRUE, matching on url.
  4. Open Append Scraped Comments and set Operation to append with Sheet set to comments (gid 6484598), mapping author to {{ $json.author }}.
  5. Credential Required: Connect your googleSheetsOAuth2Api credentials for Retrieve Video URL List, Flag Video URL Processed, and Append Scraped Comments.
  6. Confirm parallel execution: Retrieve Video URL List outputs to both Fetch Video Comments API and Flag Video URL Processed in parallel.

If your sheet tabs are renamed, update the Sheet selections in Retrieve Video URL List, Flag Video URL Processed, and Append Scraped Comments to match the new names.

Step 3: Configure Comment Scraping via API

This step retrieves YouTube comments for each URL and forwards them for storage.

  1. Open Fetch Video Comments API and set Method to POST.
  2. Set the URL to https://api.apify.com/v2/acts/mohamedgb00714~youtube-video-comments/run-sync-get-dataset-items?token=[CONFIGURE_YOUR_TOKEN].
  3. Enable Send Body and set Content Type to json.
  4. Set the JSON body to { "videoUrl": "{{ $json.url }}" }.
  5. Verify the flow Fetch Video Comments API → Append Scraped Comments is connected.

Replace [CONFIGURE_YOUR_TOKEN] with a valid Apify token in Fetch Video Comments API or requests will fail.
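For reference, here is a sketch of the request the Fetch Video Comments API node sends. The video URL and the `APIFY_TOKEN` placeholder are made up for illustration; the actor ID and endpoint match the node configuration above.

```javascript
// Build the same POST request the n8n HTTP Request node is configured with.
// Replace APIFY_TOKEN with your real Apify token before using it.
function buildCommentsRequest(videoUrl, apifyToken) {
  return {
    method: "POST",
    url:
      "https://api.apify.com/v2/acts/mohamedgb00714~youtube-video-comments" +
      `/run-sync-get-dataset-items?token=${apifyToken}`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ videoUrl }),
  };
}

const req = buildCommentsRequest(
  "https://www.youtube.com/watch?v=abc123",
  "APIFY_TOKEN"
);
```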

Step 4: Set Up the AI Agent and Memory

These nodes generate the AI instructions, maintain context, and run autonomous enrichment.

  1. Open Prepare AI Input and keep the provided jsCode that sets chatInput and a randomized sessionId.
  2. Confirm the flow Prepare AI Input → Research Automation Agent is connected.
  3. Open Research Automation Agent and keep maxIterations set to 100 with the detailed systemMessage instructions.
  4. Ensure OpenRouter Conversational Model is connected as the language model with Model set to google/gemini-2.5-flash-preview-05-20.
  5. Credential Required: Connect your openRouterApi credentials on the parent Research Automation Agent configuration (the model is attached as a sub-node).
  6. Keep Session Memory Buffer connected to Research Automation Agent for ongoing conversational context.
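If you want to adapt the Prepare AI Input step, here is a hypothetical sketch of what a code node like it does: wrap an unprocessed comment into a chat prompt and attach a randomized sessionId so Session Memory Buffer keeps runs separate. The field names (`author`, `text`) are assumptions; match them to your comments sheet columns.

```javascript
// Hypothetical version of a Prepare AI Input code node.
// Returns the chatInput the agent receives plus a random sessionId.
function prepareAiInput(comment) {
  return {
    chatInput:
      "Research this YouTube commenter and enrich the lead sheet.\n" +
      `Author: ${comment.author}\nComment: ${comment.text}`,
    // Random suffix keeps each enrichment run in its own memory session.
    sessionId: `session-${Math.random().toString(36).slice(2, 10)}`,
  };
}

const item = prepareAiInput({
  author: "@jane_doe",
  text: "Do you offer consulting?",
});
```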

Step 5: Configure AI Tools for Search, Scraping, and Lead Updates

These tools enable the agent to read comments, search the web, scrape profiles, and update lead records.

  1. Open Retrieve Comment Records, set it to filter on processed, and keep Tool Description as list of comments to process.
  2. Open Google Search Tool and verify URL https://google.serper.dev/search, header X-API-KEY set to [CONFIGURE_YOUR_API_KEY], and body parameters q set to {{ /*n8n-auto-generated-fromAI-override*/ $fromAI('parameters0_Value', ``, 'string') }}, location set to United States, and num set to 100.
  3. Open Fetch Website Markdown and keep the JSON body using {{ $fromAI('crawelEnabled', 'get other links from this website', 'boolean', false) }} and {{ $fromAI('url', 'target url', 'string') }}.
  4. Open Instagram Profile Scraper and keep the JSON body with {{ $fromAI('username', 'instagram profile user name', 'string') }} for instagramUsernames.
  5. Open Create Lead Row and set Operation to appendOrUpdate with username mapped to {{ /*n8n-auto-generated-fromAI-override*/ $fromAI('username', ``, 'string') }}.
  6. Open Update Lead Details and keep Operation as appendOrUpdate, mapping fields like email to {{ /*n8n-auto-generated-fromAI-override*/ $fromAI('email', ``, 'string') }} and short Description to {{ /*n8n-auto-generated-fromAI-override*/ $fromAI('short_Description', ``, 'string') }}.
  7. Open Set Comment Processed and set processed to {{ $fromAI('processed', ``, 'boolean') }} and match on avatarAccessibilityText using {{ /*n8n-auto-generated-fromAI-override*/ $fromAI('avatarAccessibilityText__using_to_match_', ``, 'string') }}.
  8. Credential Required: Connect your googleSheetsOAuth2Api credentials on the parent Research Automation Agent tool configuration for Retrieve Comment Records, Create Lead Row, Update Lead Details, and Set Comment Processed.

Replace all [CONFIGURE_YOUR_TOKEN] and [CONFIGURE_YOUR_API_KEY] placeholders in Google Search Tool, Fetch Website Markdown, and Instagram Profile Scraper to avoid authentication errors.
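As a sanity check while configuring the Google Search Tool, this sketch mirrors the Serper call it makes. The query and the `SERPER_API_KEY` placeholder are illustrative; the endpoint, header, and `num: 100` match the node settings above.

```javascript
// Mirror of the Serper search request the Google Search Tool sends.
// Replace SERPER_API_KEY with your real key.
function buildSerperRequest(query, apiKey) {
  return {
    method: "POST",
    url: "https://google.serper.dev/search",
    headers: {
      "X-API-KEY": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ q: query, location: "United States", num: 100 }),
  };
}

const search = buildSerperRequest(
  '"@jane_doe" site:instagram.com',
  "SERPER_API_KEY"
);
```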

Step 6: Test and Activate Your Workflow

Run a controlled test to confirm data flows into your sheets and the agent completes a full enrichment cycle.

  1. Click Execute Workflow and use Manual Execution Start to trigger a test run.
  2. Verify that Retrieve Video URL List pulls an unprocessed URL and that Fetch Video Comments API returns comment data.
  3. Confirm Append Scraped Comments adds rows to the comments sheet and Flag Video URL Processed sets the URL’s scrapped column to TRUE.
  4. Trigger Minute Schedule Trigger (or run Prepare AI Input manually) and confirm Research Automation Agent writes to Create Lead Row, Update Lead Details, and Set Comment Processed.
  5. When tests succeed, toggle the workflow to Active so schedules and chat triggers run in production.

Common Gotchas

  • Google Sheets OAuth credentials can expire or lose permissions if a workspace policy changes. If rows stop writing, check the Google Sheets credential in n8n and confirm the spreadsheet is still shared with the right account.
  • If you’re using Wait nodes or external scraping (Apify actors), processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.

Frequently Asked Questions

How long does it take to set up this YouTube lead enrichment automation?

About an hour if you already have your API keys.

Do I need coding skills to automate YouTube lead enrichment?

No. You’ll mostly connect accounts and map fields into the right Google Sheets columns.

Is n8n free to use for this YouTube lead enrichment workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenRouter usage plus Serper and Apify costs (which depend on how many comments you process).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this YouTube lead enrichment automation for influencer-only leads?

Yes, and honestly it’s a smart filter. In the “Prepare AI Input” code step, add rules that tell the agent to look for follower counts, creator keywords, or brand partnerships. Then only write a lead row when the agent returns a match, using the If node before “Create Lead Row.” Common tweaks include saving a “lead type” field, adding a priority score, and storing the profile link you actually plan to DM.
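A minimal sketch of that filter, assuming the agent returns `followerCount` and `bio` fields (adjust names and thresholds to whatever your agent actually outputs):

```javascript
// Hypothetical influencer filter for a Code or If node before Create Lead Row.
// Passes leads with enough followers OR creator-type keywords in the bio.
const CREATOR_KEYWORDS = ["creator", "founder", "coach", "brand"];

function isInfluencerLead(lead, minFollowers = 5000) {
  const bio = (lead.bio || "").toLowerCase();
  const hasKeyword = CREATOR_KEYWORDS.some((k) => bio.includes(k));
  return (lead.followerCount || 0) >= minFollowers || hasKeyword;
}
```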

Why is my Google Sheets connection failing in this workflow?

Usually it’s an expired OAuth token or the spreadsheet permissions changed. Reconnect the Google Sheets credential in n8n, then confirm the same account can edit the file. If it fails only on certain rows, check for renamed tabs or missing columns in your “videos,” “comments,” or “leads” sheets.

How many comments can this YouTube lead enrichment automation handle?

On n8n Cloud Starter, you can run up to 2,500 executions per month, so capacity depends on how you batch comments. If you self-host, there’s no execution cap, and you’ll mainly be limited by Apify/Serper rate limits and your server. Practically, most small teams run this comfortably for a few videos per day and scale up once the sheet structure and prompts are dialed in.

Is this YouTube lead enrichment automation better than using Zapier or Make?

For this workflow, n8n is usually a better fit because the AI-agent style enrichment, branching logic, and batching get expensive fast on “task-based” tools. You also get the option to self-host, which changes the math if you’re processing lots of comments. Zapier or Make can still work if you’re only doing simple capture and tagging, with no enrichment. Once you add web search, scraping, and a “processed” loop, n8n tends to be less fragile. Talk to an automation expert if you want a quick recommendation based on your volume.

This is the kind of workflow you set up once and then quietly benefit from every week. The sheet stays clean, the leads get richer, and you stop wasting time chasing breadcrumbs.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
