Apify to Supabase, clean lead lists ready to use
You grab leads from Google Maps or LinkedIn, paste them into a sheet, then spend way too long fixing columns, removing duplicates, and figuring out what’s actually usable.
This Apify-to-Supabase automation hits sales ops hardest, but marketers building lists and recruiters sourcing profiles feel it too. The outcome is simple: one clean Supabase table your team can search, filter, and reuse without the cleanup hangover.
Below you’ll see how the workflow collects your criteria via a form, scrapes the right source in Apify, cleans the fields, and stores everything neatly in Supabase.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Apify to Supabase, clean lead lists ready to use
flowchart LR
subgraph sg0["Lead Intake Form Flow"]
direction LR
n0["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/supabase.svg' width='40' height='40' /></div><br/>Store LinkedIn Records"]
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/supabase.svg' width='40' height='40' /></div><br/>Store GMaps Records"]
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Map LinkedIn Fields", pos: "b", h: 48 }
n3@{ icon: "mdi:swap-vertical", form: "rounded", label: "Map GMaps Fields", pos: "b", h: 48 }
n4@{ icon: "mdi:swap-horizontal", form: "rounded", label: "LinkedIn Scrape Job", pos: "b", h: 48 }
n5@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Google Maps Scrape Job", pos: "b", h: 48 }
n6@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Source Routing Logic", pos: "b", h: 48 }
n7["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/form.svg' width='40' height='40' /></div><br/>Lead Intake Form"]
n6 --> n5
n6 --> n4
n2 --> n0
n4 --> n2
n7 --> n6
n5 --> n3
n3 --> n1
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n7 trigger
class n4,n5,n6 decision
classDef customIcon fill:none,stroke:none
class n0,n1,n7 customIcon
The Problem: Lead scraping turns into cleanup work
Scraping leads sounds fast until you actually try to use the data. One export has phone numbers in three formats. Another has missing websites. LinkedIn results come back with “headline” text that needs trimming, and someone on the team inevitably pastes a second batch right on top of the first. Then you’re stuck doing the boring work: reformatting, reuploading, and double-checking what got missed. It’s not just time. It’s momentum. Every manual step adds another chance to mess up a list you plan to base outreach on.
The friction compounds.
- Copying exports into spreadsheets usually costs about 30 minutes per lead batch, and that’s before you fix anything.
- Field names rarely match what your CRM or outreach tool expects, so importing becomes a mini project.
- Teams can’t reuse “old” leads confidently because nobody trusts how the list was built.
- When two people scrape the same niche, you get duplicates and mixed formatting that quietly ruins deliverability and personalization.
The Solution: Apify lead scraping routed into clean Supabase tables
This workflow replaces the whole “scrape → export → paste → clean → import” routine with a single intake form and a reliable database destination. Your team submits what they want (industry/title keyword, location, source, and result count). n8n routes the request to the right Apify Actor, so you can scrape Google Maps, LinkedIn, or both depending on the selection. As results come back, the workflow maps the raw fields into a consistent structure using simple “Edit Fields” steps, which is where messy outputs get normalized. Finally, it writes clean rows into Supabase tables (one for Google Maps, one for LinkedIn), ready for searching and downstream use.
The workflow starts with an n8n form submission. From there, a routing step decides which Apify scrape job runs, then the output is cleaned and stored in Supabase so your lead list becomes an asset instead of a one-off export.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Running Google Maps and LinkedIn scrapes in Apify from a single intake form | Fresh leads on demand without exporting and pasting by hand |
| Routing each request to the right scraper and normalizing the raw fields | Consistent columns that import cleanly into your CRM or outreach tool |
| Inserting cleaned records into dedicated Supabase tables | One searchable, reusable lead library instead of scattered spreadsheets |
Example: What This Looks Like
Say you build three lead lists a week: two Google Maps pulls and one LinkedIn pull. Manually, a typical cycle looks like 20 minutes to run and export, about 30 minutes to paste and clean fields, then another 10 minutes to import and sanity-check, so roughly an hour per list (around 3 hours weekly). With this workflow, submitting the form takes about 2 minutes and the scrape runs in the background; you typically wait 10–20 minutes, then review the Supabase table. Hands-on time drops to roughly 10 minutes a week.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Apify to run Google Maps and LinkedIn scrapers
- Supabase to store, search, and reuse lead tables
- Apify API token + Supabase service role key (from Apify Settings and Supabase Settings → API)
Skill level: Beginner. You’ll paste API keys, test the form URL, and confirm rows appear in Supabase.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A lead intake form triggers everything. Your team fills in what to search for (keyword/industry, location, source, and how many results). Submit. That’s it.
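For orientation, here's roughly what a single submission looks like once it enters the workflow. The field labels match the form configured in Step 1; the values are just examples.

```javascript
// A form submission as n8n sees it. The labels come from the intake form;
// the values below are made-up examples.
const submission = {
  "Title/Industry": "dentist",
  "Location": "Austin, TX",
  "Source": "Google Maps", // "LinkedIn" or "Both" also valid
  "Number of results": 50,
};
```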
The workflow routes to the right scrape job. A routing step checks the selected source and sends the request to the matching Apify Actor (Google Maps scraper, LinkedIn profile search scraper, or both paths if you want multi-source results).
Raw results get cleaned into a predictable shape. The “Edit Fields” mapping steps take whatever Apify returns and structure it into the columns you actually care about, like name, headline, website, phone, address parts, and other useful metadata.
Supabase becomes the destination system. Clean rows are stored in the correct Supabase table (LinkedIn leads go to the LinkedIn table, Google Maps leads go to the googlemaps table), which means your team can search, filter, and export anytime without re-scraping.
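Once rows land in Supabase, querying them is straightforward. A minimal supabase-js sketch, filtering the Google Maps table on the postal_code column mentioned in Step 4 (the sample value is hypothetical):

```javascript
import { createClient } from '@supabase/supabase-js';

// Use your project URL and service role key from Supabase Settings → API.
const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

// Pull Google Maps leads for one postal code.
const { data, error } = await supabase
  .from('googlemaps')
  .select('title, postal_code, total_score')
  .eq('postal_code', '78701'); // example value

if (error) console.error(error.message);
else console.log(data);
```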
You can easily modify the form fields to capture extra criteria (like job seniority or category) based on your needs. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Form Trigger
This workflow starts when users submit the intake form that collects lead targeting criteria.
- Add the Lead Intake Form node as your trigger.
- Set Form Title to `Targeted leads` and Form Description to `This form is intended to get leads from Google Maps and LinkedIn using the Apify actor.`
- Configure fields for Title/Industry, Location, Source (a dropdown with `Google Maps`, `LinkedIn`, and `Both`), and Number of results.
- Connect Lead Intake Form to Source Routing Logic.
Step 2: Connect Apify
Apify runs the LinkedIn and Google Maps scrapers using the form inputs.
- Open LinkedIn Scrape Job and set Operation to `Run actor and get dataset`.
- Set Custom Body to `={ "locations": [ "{{ $json.Location }}" ], "maxItems": {{ $json['Number of results'] }}, "profileScraperMode": "Full", "searchQuery": "{{ $json['Title/Industry'] }}" }` and Memory to `2048`.
- Credential Required: Connect your `apifyOAuth2Api` credentials in LinkedIn Scrape Job.
- Open Google Maps Scrape Job and set Operation to `Run actor and get dataset`.
- Set Custom Body to `={ "includeWebResults": false, "language": "en", "locationQuery": "{{ $json.Location }}", "maxCrawledPlacesPerSearch": {{ $json['Number of results'] }}, "maxImages": 0, "maximumLeadsEnrichmentRecords": 0, "scrapeContacts": false, "scrapeDirectories": false, "scrapeImageAuthors": false, "scrapePlaceDetailPage": false, "scrapeReviewsPersonalData": true, "scrapeTableReservationProvider": false, "searchStringsArray": [ "{{ $json['Title/Industry'] }}" ], "skipClosedPlaces": false }`. Both bodies resolve at runtime; the sketch after this list shows what they look like for a sample submission.
- Credential Required: Connect your `apifyOAuth2Api` credentials in Google Maps Scrape Job.
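To make those one-line expressions easier to read, here's what each Custom Body resolves to for a sample submission (Title/Industry = dentist, Location = Austin, TX, Number of results = 50). The keys come straight from the expressions above; only the values are illustrative.

```javascript
// LinkedIn Scrape Job body after n8n resolves the expressions.
const linkedinBody = {
  locations: ["Austin, TX"],
  maxItems: 50,
  profileScraperMode: "Full",
  searchQuery: "dentist",
};

// Google Maps Scrape Job body after resolution.
const googleMapsBody = {
  includeWebResults: false,
  language: "en",
  locationQuery: "Austin, TX",
  maxCrawledPlacesPerSearch: 50,
  maxImages: 0,
  maximumLeadsEnrichmentRecords: 0,
  scrapeContacts: false,
  scrapeDirectories: false,
  scrapeImageAuthors: false,
  scrapePlaceDetailPage: false,
  scrapeReviewsPersonalData: true,
  scrapeTableReservationProvider: false,
  searchStringsArray: ["dentist"],
  skipClosedPlaces: false,
};
```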
Step 3: Set Up Routing and Field Mapping
Route each form submission to the correct scraper and map data into a consistent structure before storing.
- In Source Routing Logic, create rules for GoogleMaps, LinkedIn, and Both using Left Value `={{ $json.Source }}` and Right Values of `Google Maps`, `LinkedIn`, and `Both`.
- Connect the GoogleMaps output to Google Maps Scrape Job and the LinkedIn output to LinkedIn Scrape Job.
- For the Both output, connect Source Routing Logic to both LinkedIn Scrape Job and Google Maps Scrape Job in parallel.
- In Map LinkedIn Fields, map name to `={{ $json.firstName }} {{ $json.lastName }}` and keep other fields like publicIdentifier, headline, and connectionsCount using their expressions (see the sketch after this list).
- In Map GMaps Fields, set location to `="lat":{{ $json.location.lat }}, "lng":{{ $json.location.lng }}` and map the rest of the fields from the scraper output.
Step 4: Configure Supabase Storage
Store normalized LinkedIn and Google Maps records in separate Supabase tables.
- Open Store LinkedIn Records and set Table ID to `linkedin`.
- Map fields like publicidentifier to `={{ $json.publicIdentifier }}`, name to `={{ $json.name }}`, and latest_experience to `={{ $json.latest_experience }}`.
- Credential Required: Connect your `supabaseApi` credentials in Store LinkedIn Records.
- Open Store GMaps Records and set Table ID to `googlemaps`.
- Map fields like title to `={{ $json.title }}`, postal_code to `={{ $json.postalCode }}`, and total_score to `={{ $json.totalScore }}`.
- Credential Required: Connect your `supabaseApi` credentials in Store GMaps Records. (The sketch after this list shows the equivalent inserts in plain supabase-js.)
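For reference, here's the equivalent of what the two Store nodes do, sketched with supabase-js. Only the columns named above are shown; your tables will likely carry more.

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

// Store LinkedIn Records → "linkedin" table.
async function storeLinkedInRecord(item) {
  const { error } = await supabase.from('linkedin').insert({
    publicidentifier: item.publicIdentifier,
    name: item.name,
    latest_experience: item.latest_experience,
  });
  if (error) console.error('linkedin insert failed:', error.message);
}

// Store GMaps Records → "googlemaps" table.
async function storeGMapsRecord(item) {
  const { error } = await supabase.from('googlemaps').insert({
    title: item.title,
    postal_code: item.postalCode,
    total_score: item.totalScore,
  });
  if (error) console.error('googlemaps insert failed:', error.message);
}
```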
Step 5: Test and Activate Your Workflow
Run a live test to ensure routing, scraping, and storage are working end-to-end.
- Click Test Workflow and submit the Lead Intake Form with a valid Title/Industry, Location, and Source.
- Confirm that Source Routing Logic sends the data to the correct scraper and that LinkedIn Scrape Job or Google Maps Scrape Job returns dataset items.
- Verify that Map LinkedIn Fields and Map GMaps Fields output properly structured records.
- Check your Supabase tables `linkedin` and `googlemaps` for new records inserted by Store LinkedIn Records and Store GMaps Records (a quick query sketch follows this list).
- Once successful, switch the workflow to Active to accept live form submissions.
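A quick way to verify inserts from outside n8n is to pull the most recent rows. This sketch assumes your tables have a created_at column; order by whatever timestamp column you actually use.

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

// Show the five newest LinkedIn rows from the test run.
const { data, error } = await supabase
  .from('linkedin')
  .select('*')
  .order('created_at', { ascending: false }) // assumes a created_at column
  .limit(5);

console.log(error ?? data);
```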
Common Gotchas
- Apify credentials can expire or your token might lack access to the Actors. If things break, check Apify Settings → Integrations → API tokens first.
- Apify scrape times vary with result count and Actor load. If downstream nodes fail on empty responses, increase the wait/timeout so the run can finish before n8n moves on.
- Supabase inserts fail quietly when the table schema doesn’t match your mapped fields. If you edited columns, confirm the Table Editor column names still match what your “Map Fields” steps output; the sketch below shows how to surface the error.
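On that last point: Supabase doesn’t throw on a schema mismatch; the failure comes back on the response object, and logging it is the quickest way to catch a column that drifted out of sync with your mapping. A minimal sketch (the column values are placeholders):

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

const { error } = await supabase.from('googlemaps').insert({
  title: 'Sample Dental Clinic', // placeholder values, not real data
  postal_code: '78701',
  total_score: 4.8,
});

// On a mismatch, error.message names the column Supabase couldn't find,
// instead of the insert failing silently.
if (error) console.error(error.message);
```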
Frequently Asked Questions
How long does setup take?
Usually about 30 minutes once your Apify and Supabase accounts are ready.
Do I need to know how to code?
No. You’ll connect credentials and test the form once. The workflow handles the rest.
Is n8n free to use?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Apify usage credits (your scraping volume drives the cost).
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I store Google Maps and LinkedIn leads in one table?
Yes, but you’ll want to standardize the fields first. You can keep the existing routing logic and change the “Store LinkedIn Records” and “Store GMaps Records” steps to insert into one Supabase table, then adjust “Map LinkedIn Fields” and “Map GMaps Fields” so both paths output the same column set (even if some values are blank). A common tweak is adding a “source” column so you can filter later. If you also care about de-duplication, add an “If” check before insert using website, phone, or LinkedIn public identifier, as sketched below.
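Here's what that de-duplication check could look like, sketched with supabase-js. The combined leads table and its website, phone, and source columns are assumptions; match them to your own schema.

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_SERVICE_ROLE_KEY);

// Insert a lead only if no row already matches its website or phone.
async function insertIfNew(lead) {
  const { data: existing } = await supabase
    .from('leads') // hypothetical combined table
    .select('id')
    .or(`website.eq.${lead.website},phone.eq.${lead.phone}`)
    .limit(1);

  if (existing?.length) return false; // duplicate; skip
  const { error } = await supabase.from('leads').insert(lead);
  return !error;
}
```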
Why are my Supabase inserts failing?
Most of the time it’s the wrong key. This workflow expects the Supabase service role key (not the public anon key) plus the correct project URL. Also check that the table names match exactly (for example, “googlemaps”) and that Row Level Security policies aren’t blocking inserts for the API key you’re using.
How many leads can I scrape at once?
A lot, as long as your Apify plan and n8n execution limits allow it. On n8n Cloud, your monthly executions depend on your plan; on self-hosted n8n there’s no execution cap (your server is the limit). Practically, most teams run batches of a few hundred leads per search and schedule bigger pulls off-hours so they don’t hit scraping limits.
Is n8n a better fit than Zapier or Make for this?
Often, yes, because this use case benefits from branching and data shaping. n8n’s routing makes “Google Maps vs LinkedIn vs both” clean, and you can self-host if you expect lots of executions without paying per task. You also get more control over how fields are mapped before they hit your database, which is where list quality usually wins or loses. Zapier or Make can be fine for simpler flows, but scraping jobs and table inserts can get fiddly fast. Talk to an automation expert if you want a quick recommendation for your volume and tools.
If lead generation matters, the “cleanup phase” can’t be the price you pay every time. Set this up once, then keep building a clean, searchable lead library in Supabase.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.