January 22, 2026

Apify to Supabase, clean lead lists ready to use

Lisa Granqvist, Workflow Automation Expert

You grab leads from Google Maps or LinkedIn, paste them into a sheet, then spend way too long fixing columns, removing duplicates, and figuring out what’s actually usable.

This Apify Supabase automation hits sales ops hardest, but marketers building lists and recruiters sourcing profiles feel it too. The outcome is simple: one clean Supabase table your team can search, filter, and reuse without the cleanup hangover.

Below you’ll see how the workflow collects your criteria via a form, scrapes the right source in Apify, cleans the fields, and stores everything neatly in Supabase.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Apify to Supabase, clean lead lists ready to use

The Problem: Lead scraping turns into cleanup work

Scraping leads sounds fast until you actually try to use the data. One export has phone numbers in three formats. Another has missing websites. LinkedIn results come back with “headline” text that needs trimming, and someone on the team inevitably pastes a second batch right on top of the first. Then you’re stuck doing the boring work: reformatting, reuploading, and double-checking what got missed. It’s not just time. It’s momentum. Every manual step adds another chance to mess up a list you plan to base outreach on.

The friction compounds.

  • Copying exports into spreadsheets usually costs about 30 minutes per lead batch, and that’s before you fix anything.
  • Field names rarely match what your CRM or outreach tool expects, so importing becomes a mini project.
  • Teams can’t reuse “old” leads confidently because nobody trusts how the list was built.
  • When two people scrape the same niche, you get duplicates and mixed formatting that quietly ruins deliverability and personalization.

The Solution: Apify lead scraping routed into clean Supabase tables

This workflow replaces the whole “scrape → export → paste → clean → import” routine with a single intake form and a reliable database destination. Your team submits what they want (industry/title keyword, location, source, and result count). n8n routes the request to the right Apify Actor, so you can scrape Google Maps, LinkedIn, or both depending on the selection. As results come back, the workflow maps the raw fields into a consistent structure using simple “Edit Fields” steps, which is where messy outputs get normalized. Finally, it writes clean rows into Supabase tables (one for Google Maps, one for LinkedIn), ready for searching and downstream use.

The workflow starts with an n8n form submission. From there, a routing step decides which Apify scrape job runs, then the output is cleaned and stored in Supabase so your lead list becomes an asset instead of a one-off export.

What You Get: Automation vs. Results

Example: What This Looks Like

Say you build 3 lead lists a week: two Google Maps pulls and one LinkedIn pull. Manually, a typical cycle looks like 20 minutes to run/export, about 30 minutes to paste and clean fields, then another 10 minutes to import and sanity-check, so roughly 1 hour per list (around 3 hours weekly). With this workflow, submitting the form takes about 2 minutes and the scrape runs in the background; you usually just wait maybe 10–20 minutes and then review the Supabase table. The “hands-on” time drops to roughly 10 minutes a week.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Apify to run Google Maps and LinkedIn scrapers
  • Supabase to store, search, and reuse lead tables
  • Apify API token + Supabase service role key (from Apify Settings and Supabase Settings → API)

Skill level: Beginner. You’ll paste API keys, test the form URL, and confirm rows appear in Supabase.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

A lead intake form triggers everything. Your team fills in what to search for (keyword/industry, location, source, and how many results). Submit. That’s it.

The workflow routes to the right scrape job. A routing step checks the selected source and sends the request to the matching Apify Actor (Google Maps scraper, LinkedIn profile search scraper, or both paths if you want multi-source results).
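In plain JavaScript, the decision that routing step makes looks roughly like this. This is a sketch, not code from the template: the `routeSource` function and the path names are illustrative.

```javascript
// Sketch of the Source Routing Logic: map the form's Source value to
// the Apify scrape path(s) that should run. Function and path names
// are illustrative, not node names from the n8n template.
function routeSource(source) {
  switch (source) {
    case "Google Maps":
      return ["googleMapsScrapeJob"];
    case "LinkedIn":
      return ["linkedInScrapeJob"];
    case "Both":
      // On the "Both" branch, both scrape jobs run in parallel.
      return ["googleMapsScrapeJob", "linkedInScrapeJob"];
    default:
      // An unrecognized value matches no rule, so nothing runs -- the
      // same failure mode as a typo in the form's dropdown options.
      return [];
  }
}
```

Note that the matching is exact: a form option saved as "google maps" would fall through to the default branch and silently run nothing.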

Raw results get cleaned into a predictable shape. The “Edit Fields” mapping steps take whatever Apify returns and structure it into the columns you actually care about, like name, headline, website, phone, address parts, and other useful metadata.
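As a rough sketch of what one of those mapping steps does for a LinkedIn result: the input keys (firstName, lastName, headline, publicIdentifier, connectionsCount) match the expressions used later in this guide, while the trimming and fallback values are illustrative cleanup choices.

```javascript
// Sketch of an "Edit Fields"-style mapping for one raw LinkedIn item.
// Input field names mirror the expressions in the implementation guide;
// the trimming and default values are illustrative.
function mapLinkedInItem(raw) {
  return {
    name: `${raw.firstName ?? ""} ${raw.lastName ?? ""}`.trim(),
    headline: (raw.headline ?? "").trim(),
    publicIdentifier: raw.publicIdentifier ?? null,
    connectionsCount: raw.connectionsCount ?? 0,
  };
}
```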

Supabase becomes the destination system. Clean rows are stored in the correct Supabase table (LinkedIn leads go to the LinkedIn table, Google Maps leads go to the googlemaps table), which means your team can search, filter, and export anytime without re-scraping.

You can easily modify the form fields to capture extra criteria (like job seniority or category) based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Form Trigger

This workflow starts when users submit the intake form that collects lead targeting criteria.

  1. Add the Lead Intake Form node as your trigger.
  2. Set Form Title to Targeted leads and Form Description to This form is intended to get leads from Google Maps and LinkedIn using the Apify actor.
  3. Configure fields for Title/Industry, Location, Source (dropdown with Google Maps, LinkedIn, Both), and Number of results.
  4. Connect Lead Intake Form to Source Routing Logic.

Step 2: Connect Apify

Apify runs the LinkedIn and Google Maps scrapers using the form inputs.

  1. Open LinkedIn Scrape Job and set Operation to Run actor and get dataset.
  2. Set Custom Body to ={ "locations": [ "{{ $json.Location }}" ], "maxItems": {{ $json['Number of results'] }}, "profileScraperMode": "Full", "searchQuery": "{{ $json['Title/Industry'] }}" } and Memory to 2048.
  3. Credential Required: Connect your apifyOAuth2Api credentials in LinkedIn Scrape Job.
  4. Open Google Maps Scrape Job and set Operation to Run actor and get dataset.
  5. Set Custom Body to ={ "includeWebResults": false, "language": "en", "locationQuery": "{{ $json.Location }}", "maxCrawledPlacesPerSearch": {{ $json['Number of results'] }}, "maxImages": 0, "maximumLeadsEnrichmentRecords": 0, "scrapeContacts": false, "scrapeDirectories": false, "scrapeImageAuthors": false, "scrapePlaceDetailPage": false, "scrapeReviewsPersonalData": true, "scrapeTableReservationProvider": false, "searchStringsArray": [ "{{ $json['Title/Industry'] }}" ], "skipClosedPlaces": false }.
  6. Credential Required: Connect your apifyOAuth2Api credentials in Google Maps Scrape Job.
Tip: The Apify actors referenced in LinkedIn Scrape Job and Google Maps Scrape Job must be accessible in your Apify account; verify their availability before testing.
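Once n8n substitutes the expressions, the LinkedIn Custom Body above resolves to plain JSON. A sketch of that substitution for a sample submission (the `buildLinkedInBody` helper and the sample values are illustrative):

```javascript
// Build the LinkedIn actor input from a form submission, mirroring the
// Custom Body expression in step 2. The helper and sample values are
// illustrative; field keys match the form fields from step 1.
function buildLinkedInBody(form) {
  return {
    locations: [form["Location"]],
    maxItems: form["Number of results"],
    profileScraperMode: "Full",
    searchQuery: form["Title/Industry"],
  };
}

const body = buildLinkedInBody({
  "Title/Industry": "dentist",
  "Location": "Austin, TX",
  "Number of results": 50,
});
```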

Step 3: Set Up Routing and Field Mapping

Route each form submission to the correct scraper and map data into a consistent structure before storing.

  1. In Source Routing Logic, create rules for GoogleMaps, LinkedIn, and Both, each comparing Left Value ={{ $json.Source }} against a Right Value of Google Maps, LinkedIn, or Both respectively.
  2. Connect the GoogleMaps output to Google Maps Scrape Job and the LinkedIn output to LinkedIn Scrape Job.
  3. For the Both output, connect Source Routing Logic to both LinkedIn Scrape Job and Google Maps Scrape Job in parallel.
  4. In Map LinkedIn Fields, map name to ={{ $json.firstName }} {{ $json.lastName }} and keep other fields like publicIdentifier, headline, and connectionsCount using their expressions.
  5. In Map GMaps Fields, set location to ="lat":{{ $json.location.lat }}, "lng":{{ $json.location.lng }} and map the rest of the fields from the scraper output.
⚠️ Common Pitfall: The Source Routing Logic checks for exact string matches. Ensure the form’s Source options exactly match Google Maps, LinkedIn, and Both.
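The location expression in step 3 flattens the scraper's nested location object into a plain string, which is easy to mimic in JavaScript. A sketch (the `mapGMapsItem` helper name is illustrative; the input keys match the expressions in step 4):

```javascript
// Sketch of the Map GMaps Fields step: flatten one scraper result into
// the columns stored in step 4. The location value is a plain string,
// matching the expression in step 3; the helper name is illustrative.
function mapGMapsItem(raw) {
  return {
    title: raw.title,
    postal_code: raw.postalCode,
    total_score: raw.totalScore,
    location: `"lat":${raw.location.lat}, "lng":${raw.location.lng}`,
  };
}
```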

Step 4: Configure Supabase Storage

Store normalized LinkedIn and Google Maps records in separate Supabase tables.

  1. Open Store LinkedIn Records and set Table ID to linkedin.
  2. Map fields like publicidentifier to ={{ $json.publicIdentifier }}, name to ={{ $json.name }}, and latest_experience to ={{ $json.latest_experience }}.
  3. Credential Required: Connect your supabaseApi credentials in Store LinkedIn Records.
  4. Open Store GMaps Records and set Table ID to googlemaps.
  5. Map fields like title to ={{ $json.title }}, postal_code to ={{ $json.postalCode }}, and total_score to ={{ $json.totalScore }}.
  6. Credential Required: Connect your supabaseApi credentials in Store GMaps Records.
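Put together, the rows the two Store nodes insert look roughly like this. The column names come from the mappings configured above; every value here is invented for illustration.

```javascript
// Example row shapes for the two Supabase tables, using the column
// names configured in step 4. All values are invented for illustration.
const linkedinRow = {
  publicidentifier: "ada-lovelace",
  name: "Ada Lovelace",
  latest_experience: "Analytical Engines Ltd",
};

const googlemapsRow = {
  title: "Downtown Dental",
  postal_code: "78701",
  total_score: 4.8,
};
```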

Step 5: Test and Activate Your Workflow

Run a live test to ensure routing, scraping, and storage are working end-to-end.

  1. Click Test Workflow and submit the Lead Intake Form with a valid Title/Industry, Location, and Source.
  2. Confirm that Source Routing Logic sends the data to the correct scraper and that LinkedIn Scrape Job or Google Maps Scrape Job returns dataset items.
  3. Verify that Map LinkedIn Fields and Map GMaps Fields output properly structured records.
  4. Check your Supabase tables linkedin and googlemaps for new records inserted by Store LinkedIn Records and Store GMaps Records.
  5. Once successful, switch the workflow to Active to accept live form submissions.

Common Gotchas

  • Apify credentials can expire or your token might lack access to the Actors. If things break, check Apify Settings → Integrations → API tokens first.
  • Apify scrape times vary. If a run takes longer than expected and downstream nodes fail on empty responses, bump up the wait duration.
  • Supabase inserts fail quietly when the table schema doesn’t match your mapped fields. If you edited columns, confirm the Table Editor column names still match what your “Map Fields” steps output.

Frequently Asked Questions

How long does it take to set up this Apify Supabase automation?

Usually about 30 minutes once your Apify and Supabase accounts are ready.

Do I need coding skills to set up this Apify Supabase automation?

No. You’ll connect credentials and test the form once. The workflow handles the rest.

Is n8n free to use for this Apify Supabase automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Apify usage credits (your scraping volume drives the cost).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this Apify Supabase workflow to save both Google Maps and LinkedIn leads in one table?

Yes, but you’ll want to standardize the fields first. You can keep the existing routing logic and change the “Store LinkedIn Records” and “Store GMaps Records” steps to insert into one Supabase table, then adjust “Map LinkedIn Fields” and “Map GMaps Fields” so both paths output the same column set (even if some values are blank). A common tweak is adding a “source” column so you can filter later. If you also care about de-duplication, add an “If” check before insert using website, phone, or LinkedIn public identifier.
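A dedupe check like that can be sketched in a Code node as follows. The key-priority order (public identifier, then website, then phone) and the helper name are illustrative choices, not part of the template.

```javascript
// Sketch of a pre-insert dedupe: pick the first available identifier
// (LinkedIn public identifier, then website, then phone) as the key
// and drop rows whose key was already seen. Names are illustrative.
function dedupeLeads(rows) {
  const seen = new Set();
  return rows.filter((row) => {
    const key = row.publicIdentifier || row.website || row.phone;
    if (!key) return true;           // no identifier: keep, can't dedupe
    if (seen.has(key)) return false; // duplicate: drop
    seen.add(key);
    return true;
  });
}
```

Keeping rows without any identifier is a judgment call; dropping them instead is a one-line change if you'd rather only store verifiable leads.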

Why is my Supabase connection failing in this workflow?

Most of the time it’s the wrong key. This workflow expects the Supabase service role key (not the public anon key) plus the correct project URL. Also check that the table names match exactly (for example, “googlemaps”) and that Row Level Security policies aren’t blocking inserts for the API key you’re using.

How many leads can this Apify Supabase automation handle?

A lot, as long as your Apify plan and n8n execution limits allow it. On n8n Cloud, your monthly executions depend on your plan; on self-hosted n8n there’s no execution cap (your server is the limit). Practically, most teams run batches of a few hundred leads per search and schedule bigger pulls off-hours so they don’t hit scraping limits.

Is this Apify Supabase automation better than using Zapier or Make?

Often, yes, because this use case benefits from branching and data shaping. n8n’s routing makes “Google Maps vs LinkedIn vs both” clean, and you can self-host if you expect lots of executions without paying per task. You also get more control over how fields are mapped before they hit your database, which is where list quality usually wins or loses. Zapier or Make can be fine for simpler flows, but scraping jobs and table inserts can get fiddly fast. Talk to an automation expert if you want a quick recommendation for your volume and tools.

If lead generation matters, the “cleanup phase” can’t be the price you pay every time. Set this up once, then keep building a clean, searchable lead library in Supabase.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
