January 21, 2026

Google Sheets + EmailListVerify: emails from sites

Lisa Granqvist, Workflow Automation Expert

You’ve got a spreadsheet full of website domains. But turning that into real emails means endless tab hopping, copy-paste, and a weird amount of “why is this site blocking me?” frustration.

This email scraping automation hits lead gen specialists hardest. A marketing ops person cleaning lists feels it too, and so does a consultant building targeted outreach for clients. The outcome is simple: a cleaner outreach list with emails filled in, without doing the same manual steps all day.

You’ll see how this n8n workflow pulls domains from Google Sheets, tries to find emails directly on the site first, then uses EmailListVerify only when it needs to. Less cost. Less clicking. More usable leads.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Google Sheets + EmailListVerify: emails from sites

The Problem: Turning domains into emails is painfully manual

“We have the websites.” That’s the sentence that sounds like progress, right up until you have to turn those sites into a usable outreach list. Someone opens each domain, hunts for a contact page, scans the footer, copies an email, then pastes it back into a sheet. Some sites hide emails behind forms. Others load content with scripts your quick manual scan misses. And even when you do find an address, it’s often generic (contact@, info@), which is fine for small businesses but a waste of time if you’re chasing enterprise.

The friction compounds. Here’s where it usually breaks down.

  • One person can burn about 2–3 minutes per website just opening tabs and searching for “@”.
  • Copy-paste mistakes happen constantly, especially when you’re doing 50+ sites in a sitting.
  • Email finder tools cost money per lookup, so using them on every single domain gets expensive fast.
  • You end up with inconsistent data, which means your outreach tool imports get messy and you spend another hour cleaning.

The Solution: Scrape first, then verify and fill your sheet

This workflow starts with a Google Sheet of website URLs (using a template). When you run it, n8n pulls the rows, makes sure every URL is formatted correctly (so “example.com” becomes a valid web address), then requests the site content. If an email address is detected in the page content, it captures it immediately. If nothing shows up, the workflow switches to a fallback and asks EmailListVerify’s email finder API to guess an email for that domain. Finally, it expands the results (so multiple emails can become multiple rows) and writes the output back into your Google Sheet, ready for outreach.

It begins with a manual run trigger, so you control when it runs. Google Sheets supplies the input list, HTTP requests do the scraping, and EmailListVerify only gets called when scraping comes up empty. Then the workflow merges everything and updates your sheet with the emails and derived domains.

What You Get: Automation vs. Results

Example: What This Looks Like

Say you’ve got 100 websites to turn into an outreach list. Manually, if you spend about 3 minutes per site between opening pages, searching, and pasting back into Sheets, that’s roughly 5 hours. With this workflow, you paste the domains into the template once and run it. Even if the scraping and API calls take about 30 minutes of background processing, your “hands-on” time drops to a few minutes, and the sheet ends up populated and ready to filter.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets for your input list and output results
  • EmailListVerify to find emails when scraping fails
  • EmailListVerify API key (get it from your EmailListVerify account)

Skill level: Beginner. You’ll mainly connect accounts, paste in your API key, and select the right Google Sheet.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

You run it when you’re ready. The workflow uses a manual trigger, then reads your website list from Google Sheets so you can control timing and batches.

URLs get cleaned up automatically. If your sheet has “example.com” without https://, the workflow adds the proper scheme so the scrape doesn’t fail for silly reasons.
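The scheme check can be sketched in a few lines of JavaScript, the language n8n Code nodes run. This is an illustrative version of the idea, not the template's actual node code, and the function name `ensureScheme` is made up for the example:

```javascript
// Prepend http:// when a URL has no scheme, so the downstream HTTP
// request node receives a valid absolute URL instead of failing.
function ensureScheme(url) {
  const trimmed = url.trim();
  // Leave URLs that already start with http:// or https:// untouched.
  return /^https?:\/\//i.test(trimmed) ? trimmed : `http://${trimmed}`;
}

console.log(ensureScheme("example.com"));         // http://example.com
console.log(ensureScheme("https://example.com")); // https://example.com
```

Bare domains get a scheme prepended; fully qualified URLs pass through unchanged.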

The workflow tries the cheapest path first. It pulls the website HTML with an HTTP request and scans for emails. When emails are found, they’re captured and moved forward. If not, an “if” check routes that website to the EmailListVerify API to find an address from the domain.

Results are normalized and written back. The workflow derives a domain from the detected email, expands multiple emails into usable rows, then updates the output in Google Sheets so you can sort, filter, and export.

You can easily modify the fallback behavior to prioritize certain email patterns (like contact@) or to skip the API call entirely for domains you don’t want to pay to enrich. See the full implementation guide below for customization options.
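As one illustration of that kind of customization (an assumption about how you might wire it, not code shipped with the template), a small Code node could prefer role-based inboxes and skip the paid API call for domains you have already enriched:

```javascript
// Hypothetical customization: bias which scraped email is kept, and
// decide whether a domain is worth a paid EmailListVerify lookup.
const alreadyEnriched = new Set(["alreadydone.com"]); // domains to skip
const preferredPrefixes = ["contact@", "hello@", "info@"];

function pickEmail(emails) {
  // Return the first email matching a preferred prefix, else the first found.
  return emails.find(e => preferredPrefixes.some(p => e.startsWith(p))) || emails[0];
}

function shouldCallApi(domain) {
  // Only spend API credits on domains we haven't processed before.
  return !alreadyEnriched.has(domain);
}

console.log(pickEmail(["bob@x.com", "contact@x.com"])); // contact@x.com
console.log(shouldCallApi("alreadydone.com"));          // false
```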

Step-by-Step Implementation Guide

Step 1: Configure the Manual Run Trigger

This workflow starts manually, making it ideal for on-demand email extraction runs.

  1. Add and open Manual Run Trigger.
  2. Leave default settings as-is since this node does not require configuration.
  3. (Optional) Keep Flowpast Branding as a reference note for documentation; it does not affect execution.

Tip: Manual triggers are best for testing—later you can swap this out for a scheduled trigger if needed.

Step 2: Connect Google Sheets

These nodes load input websites and write output email results to Google Sheets.

  1. Open Fetch Input Rows and set Document ID to https://docs.google.com/spreadsheets/d/1VOTFM8UeWHhJbtBM7SRca6vsVJlRUXzX71kjJ8n2jUY/edit?gid=0#.
  2. Set Sheet Name to Input in Fetch Input Rows.
  3. Credential Required: Connect your googleSheetsOAuth2Api credentials to Fetch Input Rows.
  4. Open Update Output Sheet and confirm Operation is appendOrUpdate.
  5. Set Document ID to https://docs.google.com/spreadsheets/d/1VOTFM8UeWHhJbtBM7SRca6vsVJlRUXzX71kjJ8n2jUY/edit?gid=1538095319#gid=1538095319 and Sheet Name to Output.
  6. Credential Required: Connect your googleSheetsOAuth2Api credentials to Update Output Sheet.

⚠️ Common Pitfall: The Input and Output sheet tabs must exist and match the sheet names exactly.

Step 3: Set Up URL Normalization and Content Retrieval

This stage prepares URLs and fetches website content for email extraction. Fetch Input Rows outputs to both Ensure URL Scheme and Combine Streams in parallel.

  1. Open Ensure URL Scheme to verify the code prepends http:// when a URL is missing a scheme.
  2. Open Retrieve Site Content and set URL to ={{ $json.url }}.
  3. Open Capture Detected Emails and set the Email array value to ={{$json.data.match(/(?:[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,})/g)}}.
  4. Confirm Combine Streams uses Mode combine and Combine By combineByPosition to align fetched content with original rows.

Tip: If emails aren’t detected, verify the source websites return HTML content and are publicly accessible.
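The extraction in step 3 is just a global regex match over the fetched HTML. Here is the same pattern from the Capture Detected Emails node run standalone, with a null-safe fallback since `match` returns `null` when nothing is found:

```javascript
// The email pattern used in "Capture Detected Emails": a global match
// returns every address in the page content, or null if there are none.
const emailPattern = /(?:[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,})/g;

const html = '<footer>Contact: info@example.com or sales@example.com</footer>';
const emails = html.match(emailPattern) || []; // null-safe: [] when no match

console.log(emails); // [ 'info@example.com', 'sales@example.com' ]
```

The `|| []` guard matters downstream: the Validate Email Presence check expects an array, and a `null` would route the row to the fallback branch.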

Step 4: Validate Email Presence and Split Emails

Only rows with detected emails move forward, and each email is expanded into its own item.

  1. Open Validate Email Presence and confirm the condition checks leftValue ={{ $json.Email }} with the operator lengthGt and rightValue 0.
  2. Open Expand Email List and set Field To Split Out to Email.
  3. In Expand Email List, set Fields To Include to website and Destination Field Name to email.

⚠️ Common Pitfall: The Validate Email Presence node expects an array in Email. If it is empty or not an array, the flow routes to the alternate branch.
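The split-out behavior in step 4 can be sketched as follows: one input item carrying an `Email` array becomes one output row per address, with the `website` field carried along. The function name `expandEmails` is illustrative; in the workflow this is the built-in Split Out operation, not custom code:

```javascript
// Sketch of what "Expand Email List" does: expand an item's Email array
// into one row per address, keeping the originating website on each row.
function expandEmails(item) {
  return (item.Email || []).map(email => ({
    website: item.website,
    email, // destination field name per the node config
  }));
}

const rows = expandEmails({
  website: "example.com",
  Email: ["info@example.com", "sales@example.com"],
});
console.log(rows.length); // 2
```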

Step 5: Configure Domain Enrichment and Email Verification

Rows without detected emails are enriched using domain extraction and the EmailListVerify API.

  1. Open Derive Domain From Site to ensure the code extracts the domain from the website field and writes it to domain.
  2. Open Query EmailListVerify and set URL to https://api.emaillistverify.com/api/findContact.
  3. Set Method to POST and JSON Body to ={ "domain": "{{ $json.domain }}" }.
  4. Credential Required: Connect your httpHeaderAuth credentials to Query EmailListVerify.
  5. Credential Required: Connect your httpQueryAuth credentials to Query EmailListVerify.
  6. Open Derive Domain From Email to ensure the code extracts the domain from the returned email field and writes it to website.

Tip: If the API returns no emails, confirm the domain extraction logic matches the input URLs and that your API key is valid.
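The two domain-derivation steps above are simple string transforms. A sketch of the logic (the helper names `domainFromSite` and `domainFromEmail` are illustrative, standing in for the template's Code nodes):

```javascript
// Derive the bare domain from a website URL for the EmailListVerify call:
// strip the scheme, a leading www., and any path or port.
function domainFromSite(website) {
  return website
    .replace(/^https?:\/\//i, "")
    .replace(/^www\./i, "")
    .split(/[/:]/)[0];
}

// Derive a domain from a returned email address for the output sheet.
function domainFromEmail(email) {
  return email.split("@")[1];
}

console.log(domainFromSite("https://www.example.com/contact")); // example.com
console.log(domainFromEmail("info@example.com"));               // example.com
```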

Step 6: Configure Output to Google Sheets

All successful email results—whether scraped or verified—are written to the Output sheet.

  1. Ensure Expand Email List connects directly to Update Output Sheet to write detected emails.
  2. Ensure Derive Domain From Email also connects to Update Output Sheet to write verified emails.
  3. In Update Output Sheet, keep the auto-mapped columns for website, email, and confidence.

Step 7: Test and Activate Your Workflow

Run the workflow end-to-end to confirm data flows from Input to Output.

  1. Click Execute Workflow and manually run Manual Run Trigger.
  2. Verify that Fetch Input Rows loads website URLs from the Input sheet.
  3. Confirm that Update Output Sheet appends rows containing website and email.
  4. When results look correct, toggle the workflow to Active for production use.

Common Gotchas

  • Google Sheets credentials can expire or need specific permissions. If things break, check the n8n Credentials tab and confirm the account still has access to that sheet.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • EmailListVerify limits can bite you during big runs. If you see failed HTTP requests, check your API key, your plan usage, and whether the API is rate limiting you.

Frequently Asked Questions

How long does it take to set up this email scraping automation?

About 30 minutes if your sheet and API key are ready.

Do I need coding skills to automate email scraping from websites?

No. You’ll connect Google Sheets and paste in an EmailListVerify API key. The workflow logic is already built, so you’re mostly configuring inputs and checking the output formatting.

Is n8n free to use for this email scraping automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in EmailListVerify API usage costs, which depend on your plan and how often the fallback finder runs.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this email scraping automation workflow for finding personal emails instead of generic ones?

Yes, but honestly you should be careful with expectations. This workflow is designed to often return generic addresses like contact@ because those are the easiest to find by scraping. You can adjust the email detection and filtering rules in the “Capture Detected Emails” and “Validate Email Presence” parts, and you can tweak the EmailListVerify request to prioritize different patterns. Common customizations include preferring role-based emails, excluding certain inboxes, or skipping the API call for domains you’ve already processed.

Why is my Google Sheets connection failing in this workflow?

Usually it’s an expired Google authorization in n8n or the sheet moved to a different Drive location. Reconnect the Google Sheets credential, then confirm the selected spreadsheet and worksheet still match the template you copied. Also check sharing permissions if you’re using a team Drive. That’s a sneaky one.

How many websites can this email scraping automation handle?

A few hundred per run is realistic for most small teams.

Is this email scraping automation better than using Zapier or Make?

For this use case, n8n is usually the better fit because you can do conditional routing (scrape first, then API fallback), looping over many rows, and more flexible data handling without paying per tiny step. Zapier and Make can absolutely do it, but the moment you add batching, merging streams, and “only call the paid API if needed,” it gets clunky or expensive. n8n also gives you the option to self-host, which matters when you’re processing lots of domains. If you only need a light 2-step flow, Zapier is fine. If you want this to scale, n8n is the safer bet, and Talk to an automation expert if you want a quick recommendation for your exact volume.

This is the kind of workflow you set up once, then reuse forever. The repetitive scraping and enrichment runs in the background, and your Google Sheet turns into something you can actually send outreach from.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
