January 22, 2026

Scrape.do to Google Sheets, SERP rankings logged clean

Lisa Granqvist, Partner & Workflow Automation Expert

Keyword tracking sounds simple until you’re staring at a dozen Google results pages, copying titles and URLs into a spreadsheet, and praying you didn’t paste the wrong row. It’s slow. It’s messy. And it’s the kind of task that quietly eats your week.

This SERP-tracking pain hits SEO specialists the hardest, but marketers building reports and agency leads juggling client snapshots feel it too. With this automation, you end up with consistent, comparable SERP data in Google Sheets without the tab-hopping.

Below you’ll see exactly what the workflow does, the outcomes you can expect, and how to tailor it for your keywords, countries, and reporting style.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Scrape.do to Google Sheets, SERP rankings logged clean

The Problem: SERP tracking turns into spreadsheet busywork

If you’ve ever tried to do “quick competitor research” manually, you know how it goes. You search a keyword, open the top results, copy titles, paste URLs, then repeat for the next keyword and the next country. Two hours later you have a half-finished sheet, duplicate domains, a few broken links, and no confidence you could repeat the same method next week. The worst part is the mental overhead. You’re doing clerical work when you should be interpreting what the SERP is telling you.

The friction compounds, especially once reporting becomes a routine.

  • Manually checking the top 10 results across multiple keywords can burn an entire afternoon.
  • Small copy/paste mistakes make trend reporting unreliable, which means you end up double-checking everything.
  • Google results change by country, so “I’ll just spot check the US” quickly turns into five separate processes.
  • When the method isn’t repeatable, client reporting and internal SEO updates become a guessing game.

The Solution: Scrape.do SERPs into Google Sheets automatically

This n8n workflow takes your keyword list (plus target countries) from Google Sheets, pulls the live Google SERP HTML through the Scrape.do SERP API, extracts the top results, and writes clean rows back into a Results tab. You start with a simple input sheet. Then, for each keyword, the workflow encodes the search term, makes the API request, and parses out structured competitor fields like ranking position, page title, URL, and the SERP snippet text. Finally, everything gets appended into Google Sheets so you can filter, pivot, chart, or plug it into your reporting.

The workflow starts with a manual run inside n8n. From there it reads your “Keywords” tab, calls Scrape.do to fetch the Google results, and converts raw HTML into a neat table of the top 10 competitors. At the end, the “Results” tab becomes your running SERP log.

What You Get: Automation vs. Results

Example: What This Looks Like

Say you track 20 keywords across 3 countries, and you log the top 10 results each time. That’s 60 SERPs and roughly 600 rows per run. Manually, even a “quick” process (about a minute per result once you copy title, URL, and snippet) is easily 10 hours of mind-numbing work. With this workflow, you paste the keywords into your sheet, run the workflow, and let it process in the background. A single keyword usually takes about 10–20 seconds, so you’re waiting on the automation, not doing the labor.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets for keyword input and results storage
  • Scrape.do SERP API to fetch Google results HTML
  • Scrape.do API token (get it from your Scrape.do dashboard)

Skill level: Intermediate. You’ll connect credentials, confirm sheet tabs/columns, and paste an API token.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

You run it manually in n8n. That’s useful for research workflows because you can trigger it when you’re ready (weekly, monthly, before a client call).

Keywords and target countries come from Google Sheets. The workflow reads your “Keywords” tab, so your team can update inputs without touching n8n.

Scrape.do pulls the live SERP, then the workflow cleans it up. The keyword is URL-encoded, sent to the Scrape.do endpoint, and the returned HTML is parsed into structured competitor rows (top results, not a blob of markup).

Results get appended to your “Results” tab. You end up with a growing dataset that you can filter by keyword, compare countries, and use for repeatable reporting.

You can easily modify the output fields to include extra SERP details or save into separate tabs per country based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

Start the workflow manually so you can validate the data pull and parsing before running it in production.

  1. Add and open Manual Launch Trigger.
  2. Leave the default settings as-is since this trigger requires no parameters.

Step 2: Connect Google Sheets

Pull keyword inputs from a spreadsheet and later append parsed competitor results to another tab.

  1. Open Retrieve Keyword List and select your spreadsheet.
  2. Set Document to Competitor tracker and Sheet to Keywords.
  3. Credential Required: Connect your googleSheetsOAuth2Api credentials.
  4. Open Append Output Rows and set Operation to append.
  5. Set Document to Competitor tracker and Sheet to Competitor Results.
  6. Credential Required: Connect your googleSheetsOAuth2Api credentials (this node has no credentials configured yet).

⚠️ Common Pitfall: Append Output Rows will fail without Google Sheets credentials even if Retrieve Keyword List is already authenticated.

Step 3: Set Up Encoding and Request Logic

Encode keywords for URLs, then call Scrape.do to fetch Google SERP HTML for each keyword.

  1. Open Encode Search Terms and keep the JavaScript code as provided to URL-encode the Keyword field.
  2. Open Request Search Results and set URL to =https://api.scrape.do/?token={{$vars.SCRAPEDO_TOKEN}}&url={{ encodeURIComponent('https://www.google.com/search?q=' + $json.Keyword) }}&geoCode={{ $json['Target Country'] || 'us' }}&render=true.
  3. Enable Send Headers and add the Accept header with value text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8.

Tip: Add a workflow variable named SCRAPEDO_TOKEN in n8n so the expression in Request Search Results can resolve your API token.
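For reference, the encode-and-request logic above boils down to something like the following sketch. This is not the exact node code; the field names Keyword and Target Country match the sheet columns described earlier, and YOUR_TOKEN stands in for the SCRAPEDO_TOKEN workflow variable.

```javascript
// Sketch of the Encode Search Terms + Request Search Results logic.
// Builds the Scrape.do request URL for one keyword row.
function buildScrapedoUrl(keyword, targetCountry, token) {
  // URL-encode the full Google search URL so it survives
  // being passed as a query parameter to Scrape.do.
  const googleUrl = encodeURIComponent(
    'https://www.google.com/search?q=' + keyword
  );
  const geo = targetCountry || 'us'; // fall back to US results
  return (
    'https://api.scrape.do/?token=' + token +
    '&url=' + googleUrl +
    '&geoCode=' + geo +
    '&render=true'
  );
}

// Example: one row as it might arrive from the Keywords tab
const url = buildScrapedoUrl('best crm software', 'de', 'YOUR_TOKEN');
console.log(url);
```

Spaces in the keyword become %20 and the colon/slashes in the Google URL are escaped, which is exactly what the encodeURIComponent call in the node expression does.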

Step 4: Parse and Output Competitor Results

Extract top organic results from the HTML and store them in your output sheet.

  1. Open Parse Rival Data and keep the provided JavaScript to extract titles, URLs, descriptions, and positions.
  2. Confirm that Parse Rival Data outputs fields like position, websiteTitle, websiteUrl, websiteDescription, and keyword.
  3. In Append Output Rows, keep Mapping Mode set to autoMapInputData so fields map automatically to your sheet columns.
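The workflow ships with its own parsing code, but conceptually Parse Rival Data reduces to something like this sketch. The HTML pattern here is a simplified stand-in (real Google markup is messier and changes often, and snippet extraction is omitted); the output field names match the ones listed above.

```javascript
// Hypothetical sketch of extracting organic results from SERP HTML.
// Assumes each result was pre-simplified to <a href="URL"><h3>TITLE</h3></a>;
// the shipped node handles Google's real, messier markup.
function parseResults(html, keyword) {
  const pattern = /<a href="([^"]+)"><h3>([^<]+)<\/h3><\/a>/g;
  const rows = [];
  let match;
  let position = 1;
  while ((match = pattern.exec(html)) !== null && position <= 10) {
    rows.push({
      position,                 // 1-based SERP rank
      websiteTitle: match[2],
      websiteUrl: match[1],
      keyword,                  // carried through for filtering in Sheets
    });
    position += 1;
  }
  return rows;
}

const sample =
  '<a href="https://example.com/a"><h3>Result A</h3></a>' +
  '<a href="https://example.com/b"><h3>Result B</h3></a>';
const results = parseResults(sample, 'demo keyword');
console.log(results);
```

Because each row already carries position, title, URL, and keyword, autoMapInputData in Append Output Rows can drop them straight into matching sheet columns.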

Step 5: Test and Activate Your Workflow

Run a manual test to verify that keywords are fetched, requests are sent, and competitor results are appended.

  1. Click Execute Workflow from Manual Launch Trigger to run the full chain: Manual Launch Trigger → Retrieve Keyword List → Encode Search Terms → Request Search Results → Parse Rival Data → Append Output Rows.
  2. Verify that Append Output Rows adds rows to the Competitor Results sheet.
  3. If the output is correct, toggle the workflow to Active to use it in production runs.

Tip: If no results appear, inspect the HTML returned in Request Search Results to ensure the response contains Google SERP markup.


Common Gotchas

  • Google Sheets credentials can expire or need specific permissions. If things break, check the Google Sheets OAuth connection inside n8n’s Credentials area first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Scrape.do requests can fail when API credits run out or the token is wrong. Check your Scrape.do dashboard, then update the token in the HTTP Request node.

Frequently Asked Questions

How long does it take to set up this SERP rankings automation?

About 30 minutes if your sheet and Scrape.do token are ready.

Do I need coding skills to automate SERP rankings logging?

No. You’ll mainly connect Google Sheets and paste your Scrape.do API token. The parsing logic is already in the workflow.

Is n8n free to use for this SERP rankings automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Scrape.do API usage based on how many keywords you run.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this SERP rankings automation workflow for tracking more than the top 10 results?

Yes, but you’ll need to edit the parsing step. In n8n, update the code in the “Parse Rival Data” node to capture more than 10 organic results, then append the extra rows through the Google Sheets “Append Output Rows” step. Common customizations include adding the domain field, excluding specific websites, and writing results to separate tabs per country.
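For instance, adding a domain column is a small extension to the same Code node. This is a sketch, assuming websiteUrl is the field name the parsing step already outputs:

```javascript
// Sketch: derive a bare domain from each result URL so you can
// group or exclude competitors by site in Google Sheets.
function addDomain(row) {
  let domain = '';
  try {
    domain = new URL(row.websiteUrl).hostname.replace(/^www\./, '');
  } catch (e) {
    // leave domain empty if the URL is malformed
  }
  return { ...row, domain };
}

const row = addDomain({
  position: 1,
  websiteTitle: 'Example',
  websiteUrl: 'https://www.example.com/page',
});
console.log(row.domain); // "example.com"
```

With autoMapInputData still enabled, the new domain field lands in a matching sheet column automatically once you add a domain header to the Results tab.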

Why is my Scrape.do connection failing in this workflow?

Usually it’s a bad or expired token, or you’ve run out of API credits. Update the token in the HTTP Request node, then test a single keyword to confirm the response returns HTML. If you’re using render settings (like full rendering), response times can increase, so timeouts are also worth checking in n8n.

How many keywords can this SERP rankings automation handle?

A batch of 10 keywords usually finishes in a few minutes, and 50+ can take around 20–40 minutes depending on batching and API response times. On n8n Cloud you’re limited by monthly executions on your plan, while self-hosting has no execution limit (it mostly depends on your server and Scrape.do credits). If you need scale, reduce batch size to avoid rate limits and schedule runs during off-hours.

Is this SERP rankings automation better than using Zapier or Make?

For SERP scraping workflows, n8n is usually the better fit because you can control the HTTP request, parsing, and branching without paying extra for every “advanced” step. It’s also easier to keep the logic in one place, especially when you loop through lots of keywords. Zapier or Make can work, but HTML parsing tends to get awkward fast, and multi-step loops can become expensive. If you’re mainly doing simple 2-step automations, they might feel simpler day one. Talk to an automation expert if you want a quick recommendation based on your volume.

Once this is running, SERP snapshots become a button-click, not a recurring chore. Honestly, it’s the kind of automation you set up once and keep using for years.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.
