January 22, 2026

LinkedIn to Google Gemini, clean resumes ready to use

Lisa Granqvist, Workflow Automation Expert

Copying a LinkedIn profile into a document sounds quick. Then you hit broken formatting, missing dates, weird line breaks, and “skills” sprinkled across three different sections.

Recruiters feel it when they’re trying to screen faster. HR managers feel it when they need consistency across candidates. And if you run a small talent team, LinkedIn resume automation is the difference between a clean pipeline and a messy one.

This workflow takes one LinkedIn URL, pulls the profile reliably, and turns it into a standards-friendly JSON Resume you can actually use. You’ll see how it works, what you need, and what you can customize.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: LinkedIn to Google Gemini, clean resumes ready to use

The Problem: LinkedIn Profiles Aren’t Resume-Ready

LinkedIn is great for humans and frustrating for systems. The information you need is there, but it’s spread across “About,” experience entries, featured links, and skills that don’t map neatly to your ATS fields. So you end up doing the same dance: copy, paste, clean, reformat, and still miss something important. Multiply that by a hiring sprint, and you’re burning hours on admin work instead of candidate evaluation. Worse, manual cleanup creates inconsistency, which quietly breaks downstream reporting and scoring.

It adds up fast. Here’s where the friction usually shows up.

  • Profiles copy over as messy blocks of text, so you waste time reformatting before anyone can review them.
  • Anti-bot and CAPTCHA issues make “quick scraping” unreliable, which means the process collapses right when volume spikes.
  • Skills and job history end up interpreted differently by each person, so “consistent screening” becomes a nice idea, not reality.
  • You can’t easily pass a cleaned profile into other tools because it’s not structured data, it’s a patchwork document.

The Solution: LinkedIn URL → JSON Resume (Automatically)

This n8n workflow turns a LinkedIn profile URL into a clean, structured JSON Resume using Bright Data for the fetch and Google Gemini for the extraction. You start by supplying the LinkedIn URL plus a couple of required settings (like your Bright Data zone and where you want the output sent). Bright Data’s Web Unlocker pulls the profile content without the usual scraping headaches. Then Gemini converts the raw page content into clean text, extracts skills, and builds a JSON Resume that follows a consistent schema. Finally, the workflow sends a webhook notification and also saves the resume (and skills) as files so you can reuse them anywhere.

The workflow kicks off from a manual trigger, which is perfect for testing or running on-demand for specific candidates. From there it gathers the profile data, uses Gemini to transform it into structured fields, and outputs a ready-to-store JSON resume plus a separate skills artifact. No copy-paste. No “did we capture the dates?” anxiety.
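The target format is the public JSON Resume standard (jsonresume.org). A trimmed sketch of the shape the workflow's output follows — the field names come from that schema, while the sample values here are invented for illustration:

```javascript
// A trimmed sketch of the JSON Resume shape this workflow targets.
// Field names follow the jsonresume.org schema; the values are
// placeholder examples, not real candidate data.
const sampleResume = {
  basics: {
    name: "Jane Candidate",
    label: "Backend Engineer",
    summary: "Pulled from the LinkedIn 'About' section.",
  },
  work: [
    {
      name: "Acme Corp",
      position: "Senior Engineer",
      startDate: "2021-03",
      endDate: "2024-01",
      summary: "One entry per LinkedIn experience item.",
    },
  ],
  skills: [{ name: "Go" }, { name: "PostgreSQL" }],
};

// Downstream systems (ATS, scoring, reporting) can rely on these
// top-level keys always being present.
console.log(Object.keys(sampleResume)); // basics, work, skills
```

Because every profile lands in the same schema, the inconsistency problems from manual copy-paste disappear by construction.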


Example: What This Looks Like

Say you screen 20 candidates from LinkedIn in a day. Manually, if you spend about 15 minutes per profile to copy, clean, and normalize sections, that’s roughly 5 hours of pure formatting work. With this workflow, you drop in a LinkedIn URL, wait for Bright Data + Gemini to produce the JSON Resume, and you’re done; call it about 5–10 minutes per candidate including processing. That’s several hours back on any day you’re doing real volume.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Bright Data Web Unlocker for CAPTCHA-free LinkedIn scraping
  • Google Gemini API to extract structured resume fields
  • Bright Data Web Unlocker token (get it from your Bright Data zone settings)

Skill level: Intermediate. You will connect credentials, paste API keys, and edit a few “Set” fields like the profile URL and webhook destination.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

You provide the LinkedIn URL and settings. A manual trigger starts the run, then a setup step assigns the target profile URL, your Bright Data zone, and the webhook destination for the finished resume.

The profile is fetched through Bright Data. An HTTP request pulls the profile content via Web Unlocker, which is built to handle bot protections that break typical scrapers.

Gemini cleans and extracts the information. The workflow converts the pulled content into usable text, then Gemini-based extractors produce two structured outputs: a skill set and a JSON Resume that maps cleanly to consistent fields.

Results are saved and sent onward. n8n builds a file payload, writes the JSON resume and skills to disk, and sends a webhook notification so other systems can ingest it right away.

You can easily modify the output destination to push into an ATS or CRM instead of writing files locally. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

This workflow starts manually, so you can run the resume generation on demand.

  1. Add and keep Manual Launch Trigger as the first node on the canvas.
  2. Connect Manual Launch Trigger to Assign Source Settings to begin the data flow.

Step 2: Connect External Data Request

Configure source parameters and fetch the profile markdown from the external API.

  1. In Assign Source Settings, set url to https://www.linkedin.com/in/[YOUR_ID], zone to web_unlocker1, and webhook_notification_url to https://webhook.site/[YOUR_ID].
  2. Open External Data Request and set URL to https://api.brightdata.com/request and Method to POST.
  3. Enable Send Body and set body parameters: zone to {{ $json.zone }}, url to {{ $json.url }}, format to raw, and data_format to markdown.
  4. Credential Required: Connect your httpHeaderAuth credentials in External Data Request.
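Under the hood, that HTTP node issues a request shaped roughly like this. The endpoint and body fields come straight from the node settings above; the Bearer-style Authorization header reflects how Bright Data's `/request` API is typically authenticated, and `YOUR_TOKEN` is a placeholder for your Web Unlocker token:

```javascript
// Sketch of the request the "External Data Request" node sends.
// Endpoint and body fields mirror the n8n node settings; the header
// format is an assumption about Bright Data's token auth.
function buildBrightDataRequest(settings, token) {
  return {
    url: "https://api.brightdata.com/request",
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      zone: settings.zone,       // e.g. "web_unlocker1"
      url: settings.url,         // the LinkedIn profile URL
      format: "raw",
      data_format: "markdown",   // ask Web Unlocker for markdown output
    }),
  };
}

const req = buildBrightDataRequest(
  { zone: "web_unlocker1", url: "https://www.linkedin.com/in/example" },
  "YOUR_TOKEN"
);
// Here we only build the payload; inside n8n, the HTTP Request node
// performs the actual call with your httpHeaderAuth credential.
```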

Convert Markdown to Text uses the markdown returned from the request and prepares it for AI extraction.

Step 3: Set Up the Markdown-to-Text AI Processing

Use the Gemini model to clean and normalize the markdown into plain text.

  1. In Convert Markdown to Text, set Text to =You need to analyze the below markdown and convert to textual data. Please do not output with your own thoughts. Make sure to output with textual data only with no links, scripts, css etc. {{ $json.data }}.
  2. Set the message to You are a markdown expert in the messages section.
  3. Open Gemini Markdown Model and keep Model Name as models/gemini-2.0-flash-exp.
  4. Credential Required: Connect your googlePalmApi credentials in Gemini Markdown Model. This model is connected as the language model for Convert Markdown to Text.

Convert Markdown to Text outputs to both Skill Data Miner and JSON Resume Builder in parallel.
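The conversion step amounts to a simple template around the fetched markdown: n8n substitutes `{{ $json.data }}` with the content from the previous node. A sketch of how that expression resolves, outside of n8n:

```javascript
// Mirrors how the Convert Markdown to Text prompt resolves: n8n
// replaces {{ $json.data }} with the markdown from the fetch step.
function buildCleanupPrompt(markdown) {
  return (
    "You need to analyze the below markdown and convert to textual data. " +
    "Please do not output with your own thoughts. Make sure to output with " +
    "textual data only with no links, scripts, css etc. " + markdown
  );
}

const prompt = buildCleanupPrompt("# Jane Candidate\n[Profile](https://...)");
console.log(prompt.startsWith("You need to analyze")); // true
```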

Step 4: Extract Skills and Build JSON Resume (Parallel Branches)

Two extraction paths run simultaneously: one for skill mining and one for a full JSON resume.

  1. In Skill Data Miner, set Text to =Perform Data Mining and extract the skills from the provided resume {{ $json.text }} and keep the provided inputSchema JSON for skills.
  2. Open Gemini Skill Model and keep Model Name as models/gemini-2.0-flash-exp.
  3. Credential Required: Connect your googlePalmApi credentials in Gemini Skill Model. This model is connected as the language model for Skill Data Miner.
  4. In JSON Resume Builder, set Text to =Extract the resume in JSON format. {{ $json.text }} and keep the provided inputSchema JSON for the resume.
  5. Open Gemini Resume Model and keep Model Name as models/gemini-2.0-flash-exp.
  6. Credential Required: Connect your googlePalmApi credentials in Gemini Resume Model. This model is connected as the language model for JSON Resume Builder.

JSON Resume Builder feeds into Transform Output Script, while Skill Data Miner continues to the skills file branch.
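The article keeps each extractor node's bundled `inputSchema` as-is; that schema is not reproduced here, so the shape below is a guess — a flat list of skill names — paired with a minimal check you could run on the Skill Data Miner output:

```javascript
// Assumed shape of the skills inputSchema (the real one ships with
// the workflow template): an object holding an array of skill strings.
const skillsSchema = {
  type: "object",
  properties: {
    skills: { type: "array", items: { type: "string" } },
  },
  required: ["skills"],
};

// Cheap structural check on the extractor's output, useful before
// passing it downstream (a full JSON Schema validator also works).
function looksLikeSkillsOutput(output) {
  return (
    output !== null &&
    typeof output === "object" &&
    Array.isArray(output.skills) &&
    output.skills.every((s) => typeof s === "string")
  );
}

console.log(looksLikeSkillsOutput({ skills: ["Python", "SQL"] })); // true
console.log(looksLikeSkillsOutput({ skills: [42] }));              // false
```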

Step 5: Build Output Files and Webhook Notice

Format the outputs, write files, and send a webhook notice once the resume JSON is ready.

  1. In Transform Output Script, set JavaScript Code to return $input.first().json.output.
  2. Transform Output Script outputs to both Send Webhook Notice and Build Binary Payload in parallel.
  3. In Send Webhook Notice, set URL to {{ $('Assign Source Settings').item.json.webhook_notification_url }} and body field json_resume to {{ $('JSON Resume Builder').item.json.output.toJsonString() }}.
  4. In Build Binary Payload, keep the function code that base64-encodes the JSON for file writing.
  5. In Save Structured File, set File Name to =d:\Json_Resume.json and Operation to write.
  6. In Build Skill Binary, keep the function code that base64-encodes the skills JSON.
  7. In Save Skills File, set File Name to =d:\Resume_Skills.json and Operation to write.

⚠️ Common Pitfall: Ensure the file paths used by Save Structured File and Save Skills File are valid on your n8n host machine, otherwise the write operation will fail.
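The Build Binary Payload and Build Skill Binary nodes base64-encode JSON so the file-write nodes can treat it as binary data. The workflow ships its own node code; this is a minimal sketch of the round-trip those nodes rely on, using n8n's item shape (`json` plus `binary.data`):

```javascript
// Minimal sketch of the base64 step the binary-builder nodes perform
// (the workflow template ships its own version of this code).
function toBinaryPayload(obj, fileName) {
  const json = JSON.stringify(obj, null, 2);
  return {
    json: {},
    binary: {
      data: {
        data: Buffer.from(json, "utf-8").toString("base64"),
        mimeType: "application/json",
        fileName,
      },
    },
  };
}

const item = toBinaryPayload({ skills: ["Go"] }, "Resume_Skills.json");

// Decoding recovers the original JSON exactly, which is what the
// file-write node ultimately puts on disk:
const decoded = JSON.parse(
  Buffer.from(item.binary.data.data, "base64").toString("utf-8")
);
console.log(decoded.skills[0]); // "Go"
```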

Step 6: Test and Activate Your Workflow

Run a full test to confirm the resume and skills files are created and the webhook is notified.

  1. Click Execute Workflow and verify External Data Request returns markdown data.
  2. Confirm Convert Markdown to Text produces clean text, and both Skill Data Miner and JSON Resume Builder run successfully in parallel.
  3. Check the filesystem for d:\Json_Resume.json and d:\Resume_Skills.json created by Save Structured File and Save Skills File.
  4. Verify the webhook target receives the json_resume payload from Send Webhook Notice.
  5. When results are correct, switch the workflow to Active to use it in production.

Common Gotchas

  • Bright Data credentials can expire or need specific permissions. If things break, check your Web Unlocker token and zone access in Bright Data first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.

Frequently Asked Questions

How long does it take to set up this LinkedIn resume automation?

About 30 minutes if you already have your Bright Data and Gemini keys.

Do I need coding skills to build this LinkedIn resume automation?

No. You’ll mostly paste credentials and edit a few fields like the LinkedIn URL and webhook destination.

Is n8n free to use for this LinkedIn resume automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Bright Data and Google Gemini API usage costs.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this LinkedIn resume automation workflow for multilingual profiles?

Yes, and it’s a practical upgrade. Add a translation LLM step right after the “Convert Markdown to Text” stage, then feed the translated text into the skill extractor and the JSON Resume builder. You can also adjust the resume schema mapping if you want localized field names, but most teams keep the JSON keys stable and translate only the content. If you prefer OpenAI, you can swap the Gemini model used by the extraction steps for an OpenAI Chat Model node.

Why is my Bright Data connection failing in this workflow?

Usually it’s an expired token or the wrong Bright Data zone. Regenerate the Web Unlocker token in Bright Data, then update the Header Authentication credential in n8n. If it still fails, check whether your plan allows Web Unlocker requests and whether LinkedIn is returning a blocked response for that specific URL.

How many profiles can this LinkedIn resume automation handle?

It depends on your n8n plan and how fast Bright Data and Gemini respond, but most teams comfortably process dozens of profiles per day with this setup.

Is this LinkedIn resume automation better than using Zapier or Make?

Often, yes, because this is not a simple “A to B” zap. You’re doing scraping, cleanup, LLM extraction, file creation, and webhook delivery, which is where n8n’s flexibility really helps. Self-hosting is also a big deal if you plan to run high volume without counting every task. Zapier or Make can still work if you keep it small and you don’t need advanced branching or file handling. Frankly, the bigger question is governance: where do you want the data to live, and who needs access? Talk to an automation expert if you want help deciding.

Once this is running, you stop “rewriting LinkedIn into a resume” and start collecting clean candidate data on demand. Set it up once, and let the workflow do the boring part.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
