Home n8n Workflow
January 22, 2026

Apify + Google Sheets: clean LinkedIn leads fast

Lisa Granqvist Partner Workflow Automation Expert

You finally get a list of LinkedIn profile URLs… and then you lose an afternoon cleaning it. Names don’t line up, job titles are messy, half the rows are missing basics, and you still haven’t started outreach.

The pain hits sales ops and growth teams hardest, but recruiters and agency owners feel it too. Instead of copy-paste and “fix it later,” this Apify Sheets leads automation gives you consistent lead rows in Google Sheets that are actually ready to use.

Below you’ll see how the workflow runs, what it replaces, and what to watch out for when LinkedIn data gets finicky.

How This Automation Works

See how this solves the problem:

n8n Workflow Template: Apify + Google Sheets: clean LinkedIn leads fast

The Challenge: Turning LinkedIn URLs Into Usable Lead Rows

LinkedIn URLs are easy to collect and annoying to turn into something useful. You can have 200 profiles sitting in a sheet, but that doesn’t help you segment by title, prioritize by seniority, or even personalize a first line. The “quick fix” is usually manual: open tabs, copy headlines, paste into columns, then realize you pasted the wrong person because two profiles looked similar. Meanwhile, the list goes stale and your follow-up slips another day.

It adds up fast. Here’s where it breaks down.

  • You burn about 2–5 minutes per profile just to grab the basics (name, title, company), and that’s before you sanity-check anything.
  • Your sheet ends up inconsistent, so filtering by role or industry becomes a mini data project.
  • Doing this in bursts leads to errors, like mismatched rows or duplicate profiles that quietly skew your reporting.
  • Most “lead tools” don’t start from your existing URL list, which means you’re re-building lists instead of cleaning them.

The Fix: Apify Scraping + Clean Storage in Google Sheets

This workflow takes the LinkedIn profile URLs you already have in Google Sheets and turns them into structured lead data using Apify. It starts by reading a column of URLs (your input list), then processes them in controlled batches so you don’t hammer APIs or trigger avoidable failures. For each URL, n8n formats the request, calls Apify through HTTP, waits for the scraper run to finish, then pulls the final profile results. Those results are appended back into Google Sheets as clean rows, and the workflow updates a progress log so you can see what happened without guessing. When the run is done, it sends a success email so you’re not babysitting a tab.
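The control loop described above can be sketched as a small function. This is an illustration of the batching pattern, not the actual n8n node internals: the function names are hypothetical, and the I/O steps (Apify call, Sheets append, progress log) are injected as callbacks so the loop itself stays testable.

```javascript
// Sketch of the workflow's control loop (illustrative names, not n8n internals).
// Callers inject the I/O steps, so the batching logic has no network dependency.
function runInBatches(urls, batchSize, scrapeBatch, appendRows, logProgress) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize); // one controlled batch
    const rows = scrapeBatch(batch);            // Apify scraping happens here
    appendRows(rows);                           // clean rows -> Google Sheets
    logProgress({ processed: i + batch.length, total: urls.length });
    results.push(...rows);
  }
  return results;
}
```

The key design choice is that each batch finishes its Sheets write and log update before the next one starts, which is what keeps you from hammering APIs or losing track of partial progress.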

The workflow begins when you trigger it (typically via webhook, or on a schedule). From there, Apify does the heavy lifting of scraping and returning profile data. Finally, Google Sheets becomes your clean source of truth, with a progress tracker and an email confirmation when everything finishes.

Real-World Impact

Say you collect 150 LinkedIn profile URLs from event attendees or Sales Navigator. Manually, even 3 minutes per profile is about 7–8 hours of tab-switching and copy-paste. With this workflow, you paste URLs into one Google Sheet column, trigger the run, and wait for Apify to return results while n8n appends rows automatically. You might spend 10 minutes setting it off and spot-checking output, then you move on with your day.

Requirements

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets for storing URLs and clean lead rows
  • Apify to scrape LinkedIn profile data
  • Apify API token (get it from Apify Console → API)

Skill level: Beginner. You’ll connect accounts, choose the right spreadsheet columns, and paste an API token.

Need help implementing this? Talk to an automation expert (free 15-minute consultation).

The Workflow Flow

Webhook or schedule trigger. You kick off the run on demand, or you let it run on a cadence (useful when your team adds new URLs throughout the week).

Google Sheets pulls the input list. The workflow reads your LinkedIn URLs from a specific column (commonly named linkedin_url), then prepares them so each profile is processed cleanly.

Batching + Apify API scraping. n8n loops over URLs in batches, formats a request URL, and calls Apify via HTTP to launch the LinkedIn scraper actor. It then waits for the actor run and fetches the final scraped results when they’re ready.

Append results and notify. Scraped data is appended back into Google Sheets, a progress log is updated, and a completion email is sent through Gmail so you don’t have to keep checking.

You can easily modify which fields you store in Sheets or how big each batch is based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Trigger

This workflow starts from a data source rather than an explicit trigger, so you’ll run it manually or connect your preferred trigger before production use.

  1. Decide how you want to start the workflow (manual run, or add a trigger node later).
  2. Ensure the first node in the execution path is Retrieve LinkedIn Links so the batch loop has input rows.

Step 2: Connect Google Sheets

These nodes read keywords, log progress, and append scraped results to your sheets.

  1. Open Retrieve LinkedIn Links and select Linkedin Post Keywords as the Document and Final keywords as the Sheet. Credential Required: Connect your googleApi credentials.
  2. Open Launch Apify Scraper and set the Document to Linkedin Post Keywords and the Sheet to 24 June 2025. Credential Required: Connect your googleApi credentials.
  3. Open Append to Sheets and confirm Operation is set to append. Map fields as shown: url to {{ $json.url }}, text to {{ $json.text }}, type to {{ $json.type == undefined ? "--" : $json.type }}, title to {{ $json.title }}, inputUrl to {{ $json.inputUrl }}, authorName to {{ $json.authorName }}, postedAtISO to {{ $json.postedAtISO }}, authorProfileUrl to {{ $json.authorProfileUrl }}. Credential Required: Connect your googleApi credentials.
  4. Open Update Progress Log and keep Operation as appendOrUpdate. Map Keywords to {{ $('Batch Scheduler').item.json.Keywords }} and Total Count 24-06-2025 to {{ $('Fetch Scrape Results').all().length }}. Credential Required: Connect your googleApi credentials.
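The Append to Sheets mappings above boil down to one row-shaping step. A minimal sketch, with `toSheetRow` as a hypothetical helper name, mirroring the expressions from item 3 including the `"--"` fallback for a missing type:

```javascript
// Mirrors the Append to Sheets field mapping, including the "--" fallback
// used when an item has no `type`. The helper name is illustrative.
function toSheetRow(item) {
  return {
    url: item.url,
    text: item.text,
    type: item.type === undefined ? "--" : item.type,
    title: item.title,
    inputUrl: item.inputUrl,
    authorName: item.authorName,
    postedAtISO: item.postedAtISO,
    authorProfileUrl: item.authorProfileUrl,
  };
}
```

Mapping every column explicitly (rather than dumping the raw Apify item) is what keeps the sheet schema stable across runs.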

Step 3: Set Up Batch Processing and Query Formatting

These nodes control the batch loop and build LinkedIn search URLs for Apify.

  1. In Batch Scheduler, keep default settings to split input rows into batches.
  2. In Format Query URL, keep the JavaScript code as-is to generate the URL using the Keywords field.
  3. Confirm that Batch Scheduler outputs to both Launch Apify Scraper and Format Query URL in parallel.
  4. In Completion Delay, set Amount to 10 to pause between batches.
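The article doesn't show the Format Query URL code itself, so here is a plausible sketch of what it does, assuming it builds a LinkedIn content-search URL from the Keywords field (the exact search path is an assumption; the important detail is URL-encoding the keywords):

```javascript
// Plausible sketch of the Format Query URL node. The search path is an
// assumption; the essential step is encodeURIComponent on the Keywords value.
function buildSearchUrl(keywords) {
  const base = "https://www.linkedin.com/search/results/content/";
  return `${base}?keywords=${encodeURIComponent(keywords)}`;
}
```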

Step 4: Configure Apify HTTP Requests

These nodes trigger the Apify actor, wait for completion, and fetch the dataset results.

  1. Open Request Profile Data and set URL to https://api.apify.com/v2/acts/linkedin-scraper/runs, Method to POST, and JSON Body to the provided payload containing {{ $json.url }} and {{ $credentials.linkedinAuth.sessionCookie }}. Credential Required: Connect your httpHeaderAuth credentials.
  2. Open Await Actor Run and set URL to https://api.apify.com/v2/actor-runs/{{ $json.data.id }} with Query Parameter waitForFinish=100. Credential Required: Connect your httpHeaderAuth credentials.
  3. Open Fetch Scrape Results and set URL to https://api.apify.com/v2/datasets/{{ $json.data.defaultDatasetId }}/items. Credential Required: Connect your httpHeaderAuth credentials.
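The three endpoints above follow a consistent Apify v2 pattern: launch an actor run, poll the run by ID, then read the run's default dataset. Sketched as plain URL builders (with `linkedin-scraper` standing in for whatever actor your workflow uses, and the API token supplied separately via the httpHeaderAuth credential):

```javascript
// The three Apify v2 endpoints from Step 4, as URL builders.
// "linkedin-scraper" is a placeholder actor name; auth goes in headers.
const APIFY = "https://api.apify.com/v2";

const launchRunUrl = () => `${APIFY}/acts/linkedin-scraper/runs`;
// waitForFinish=100 asks Apify to hold the response up to 100s for completion
const awaitRunUrl = (runId) => `${APIFY}/actor-runs/${runId}?waitForFinish=100`;
const datasetItemsUrl = (datasetId) => `${APIFY}/datasets/${datasetId}/items`;
```

In n8n, `runId` comes from `{{ $json.data.id }}` on the launch response and `datasetId` from `{{ $json.data.defaultDatasetId }}` on the run status response.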

Step 5: Configure Output Email

Send a completion email once the batch launch step executes.

  1. Open Dispatch Success Email and set Send To to {{ $credentials.emailNotification.recipientEmail }}. Credential Required: Connect your gmailOAuth2 credentials.
  2. Set Subject to Apify LinkedIn data details - {{ $now.format('DD MMMM YYYY') }}.
  3. Set Message to the provided HTML, including {{ $now.format('DD MMMM YYYY') }} and {{ $('Launch Apify Scraper').all().length }}.
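For reference, a rough JavaScript equivalent of the `{{ $now.format('DD MMMM YYYY') }}` subject expression; using `en-GB` long-month formatting is an assumption about the intended output (e.g. "22 January 2026"):

```javascript
// Rough equivalent of the subject expression above. en-GB long-month
// formatting is assumed; timeZone pinned to UTC for deterministic output.
function subjectLine(date) {
  const formatted = date.toLocaleDateString("en-GB", {
    day: "2-digit", month: "long", year: "numeric", timeZone: "UTC",
  });
  return `Apify LinkedIn data details - ${formatted}`;
}
```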

Step 6: Test and Activate Your Workflow

Run a full test to confirm the batch loop, Apify calls, and sheet updates work end-to-end.

  1. Click Execute Workflow and verify Retrieve LinkedIn Links loads keyword rows.
  2. Confirm Format Query URL outputs a LinkedIn search URL and Request Profile Data returns an Apify run ID.
  3. Check Fetch Scrape Results for dataset items, and verify Append to Sheets writes rows to 24 June 2025.
  4. Confirm Update Progress Log appends or updates counts in Final keywords, and Dispatch Success Email sends the summary.
  5. Activate the workflow when results look correct so it’s ready for production runs.

Watch Out For

  • Apify credentials can expire or the actor can require extra permissions. If things break, check your Apify Console token and actor access first.
  • If you’re using Wait nodes or external scraping runs, processing times vary. Bump up the wait duration if downstream HTTP requests fail on empty responses.
  • Default “what fields do we save?” choices can be too generic. Decide your sheet schema early (title, company, location, profile headline) or you’ll be cleaning outputs forever.
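Rather than hard-coding a single bumped-up wait, one option is to back off between retries when a run hasn't finished yet. A minimal sketch (the numbers are illustrative defaults, not values from the workflow):

```javascript
// Exponential backoff with a cap, for retrying the dataset fetch when a
// scraper run is still in progress. Defaults are illustrative.
function nextWaitSeconds(attempt, base = 10, cap = 120) {
  return Math.min(cap, base * 2 ** attempt);
}
```

In n8n you could feed this into a Wait node's Amount so early retries are cheap and later ones give slow runs room to finish.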

Common Questions

How quickly can I implement this Apify Sheets leads automation?

About 30 minutes if your Apify and Google accounts are ready.

Can non-technical teams implement this lead cleaning?

Yes. No coding required, but someone needs to connect Google Sheets and paste an Apify API token into n8n.

Is n8n free to use for this Apify Sheets leads workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Apify usage, which can be a few dollars for small lists and more if you scrape heavily.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

How do I adapt this Apify Sheets leads solution to my specific challenges?

Most customizations happen where results are written and how items are batched: adjust the fields mapped in “Append to Sheets,” change how often “Update Progress Log” writes status, and tune “Batch Scheduler” to run smaller groups if LinkedIn rate limiting becomes a problem. If you want a different Apify actor, swap the actor settings in the HTTP requests that launch and fetch runs, then keep the same Sheets storage pattern.

Why is my Google Sheets connection failing in this workflow?

Usually it’s expired Google authorization in n8n or the spreadsheet was moved and the ID changed. Reconnect the Google Sheets credential in n8n, then confirm the exact sheet and tab names still match. Also check sharing permissions if the sheet lives in a shared drive.

What’s the capacity of this Apify Sheets leads solution?

It comfortably handles hundreds of URLs per run for most teams, with batching controlling the pace. On self-hosted n8n there’s no execution cap (your server and Apify limits matter most). If you’re on n8n Cloud, capacity depends on your plan’s monthly executions and how many URLs you process in each run.

Is this Apify Sheets leads automation better than using Zapier or Make?

Often, yes, if you’re dealing with batches, waiting for scraper runs, and writing results back reliably. n8n is comfortable with loops and “wait until finished” patterns, and self-hosting can keep costs predictable when volume rises. Zapier or Make can work, but multi-step scraping flows tend to get brittle and expensive as tasks grow. The other benefit is control: you can log progress, retry selectively, and add branching without rebuilding the whole thing. If you’re on the fence, Talk to an automation expert and you’ll get a straight recommendation.

Clean data changes everything, honestly. Set this up once, and your LinkedIn URL dumps turn into usable prospect lists without the spreadsheet chaos.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
