ScrapingRobot to Google Sheets, reliable rank tables
Rank tracking sounds simple until you’re doing it by hand. Tabs everywhere. Incognito searches. “Wait, why is this keyword suddenly on page 4?” SERP rank tracking automation fixes the messy part: collecting consistent Google results without the daily copy-paste grind.
SEO managers feel it during weekly reporting. So do marketing leads who need fast answers for stakeholders, and agency operators who cannot afford “we’ll check later.” This workflow turns a recurring task into a repeatable system.
You’ll set up an n8n workflow that pulls Google SERPs from ScrapingRobot, cleans the results, calculates ranks, and sends the output into Google Sheets for reliable tables you can actually trust.
How This Automation Works
Here’s the complete workflow you’ll be setting up:
n8n Workflow Template: ScrapingRobot to Google Sheets, reliable rank tables
```mermaid
flowchart LR
subgraph sg0["When clicking ‘Test workflow’ Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking ‘Test workflow’", pos: "b", h: 48 }
n1@{ icon: "mdi:swap-vertical", form: "rounded", label: "SERP results", pos: "b", h: 48 }
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Separate", pos: "b", h: 48 }
n3@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Title Empty", pos: "b", h: 48 }
n4["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Assign SERP #pos"]
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>GET SERP"]
n6@{ icon: "mdi:cog", form: "rounded", label: "Connect to your own database..", pos: "b", h: 48 }
n7@{ icon: "mdi:swap-vertical", form: "rounded", label: "Set Keywords to get SERPs for", pos: "b", h: 48 }
n8@{ icon: "mdi:cog", form: "rounded", label: "Connect to your own database2", pos: "b", h: 48 }
n9@{ icon: "mdi:swap-vertical", form: "rounded", label: "Assign SearchQuery", pos: "b", h: 48 }
n10@{ icon: "mdi:swap-vertical", form: "rounded", label: "Split out Keywords", pos: "b", h: 48 }
n5 --> n1
n2 --> n3
n1 --> n2
n3 --> n9
n4 --> n8
n9 --> n4
n10 --> n5
n7 --> n10
n6 --> n7
n0 --> n6
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n3 decision
class n5 api
class n4 code
```
Why This Matters: Reliable keyword rank tracking without manual checks
Manual rank tracking breaks in quiet ways. You search a keyword once, see a result, and paste it into a spreadsheet. Next week you do the same thing, but Google shows a slightly different page because location, personalization, and timing changed. Then reporting day arrives and you’re trying to explain “movement” that might not even be real. The worst part is the mental load. You’re spending your best analytical time collecting data instead of interpreting it.
It adds up fast. Here’s where the friction compounds.
- Checking even 30 keywords manually can eat about 2 hours once you include copying URLs, titles, and positions.
- Different people check rankings differently, so the spreadsheet becomes inconsistent and hard to compare week to week.
- Competitor tracking often gets “postponed,” which means you only notice them after they’ve taken your spot.
- When results look off, you end up re-checking everything, and the whole task doubles in time.
What You’ll Build: ScrapingRobot SERP pulls into clean Google Sheets tables
This workflow starts with a list of keywords you want to track (either typed into n8n for a quick test, or pulled from your own keyword database later). For each keyword, it calls the ScrapingRobot API to fetch Google search results and returns structured SERP data like URLs, page titles, and ordering. Then it reshapes that raw response into tidy rows, splits out organic results, and filters out junk entries (like missing titles). Finally, it attaches the original query term to every row and calculates the rank position so your sheet has “Keyword + Rank + Title + URL” ready for reporting. The end result is a consistent rank table you can refresh on demand or schedule.
The workflow kicks off in n8n, loops through your keyword list, and retrieves SERP responses via HTTP. After that, it normalizes fields, extracts organic listings, computes the rank number, then pushes the cleaned rows into Google Sheets (or another output you prefer).
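To make that concrete, here is roughly the shape of one finished row. The values below are placeholders, and the field names follow the mapping and ranking steps later in this guide:

```javascript
// Illustrative shape of one finished row (placeholder values, not real results).
// Field names match the mapping and ranking steps described below.
const exampleRow = {
  searchQuery: "n8n automation",   // the keyword this row was ranked for
  position: 3,                     // computed rank among organic results
  title: "Example page title",     // from the organic result
  url: "https://example.com/page", // from the organic result
};
```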
What You’re Building
| What Gets Automated | What You’ll Achieve |
|---|---|
| Pulling Google SERPs for every keyword via the ScrapingRobot API | Consistent results without incognito searches or copy-paste |
| Cleaning, splitting, and ranking organic results into rows | A “Keyword + Rank + Title + URL” table in Google Sheets, ready for reporting |
Expected Results
Say you track 50 keywords and capture the top 10 organic results for each. Manually, even a “quick check” runs about 2 minutes per keyword once you copy titles, URLs, and positions, which works out to roughly 100 minutes per run. With this workflow, you paste or load the keyword list, run it, and let ScrapingRobot return the SERPs while n8n writes rows into Google Sheets. Your hands-on time drops to about 10 minutes, and you still get a fuller dataset for competitor analysis.
Before You Start
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- ScrapingRobot for Google SERP data via API
- Google Sheets to store and share rank tables
- ScrapingRobot API key (get it from your ScrapingRobot dashboard)
Skill level: Beginner. You’ll mostly paste an API key, choose keyword inputs, and point the output at a sheet.
Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).
Step by Step
A manual run starts it. In the template, you click “Execute workflow” to test. Once you’re happy, you can switch the trigger to a schedule so your SERP rank tracking automation updates daily or weekly.
Keywords get defined and expanded. The workflow sets a list of keywords, then splits them into individual items so each keyword is processed cleanly. If you already keep keywords in a sheet or database, you’ll swap this part to pull from your source instead.
ScrapingRobot fetches the SERPs. n8n sends an HTTP request to the ScrapingRobot “GET SERP” endpoint with your token and keyword query. The response contains organic results (and often extra SERP details) in a structured payload.
Results are cleaned and ranked. The workflow maps fields into a standard format, splits organic results into rows, filters nonempty titles, attaches the keyword, and calculates the rank position so your report doesn’t rely on manual counting.
You can easily modify the keyword source to pull from Google Sheets instead of the built-in keyword list based on your needs. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
This workflow starts manually, then passes through a placeholder node before keyword setup begins.
- Add the Manual Execution Start node as your trigger.
- Connect Manual Execution Start to Database Placeholder Step to mirror the workflow’s initial flow.
- Keep Database Placeholder Step as a no-op node to allow future database connections before keyword setup.
Step 2: Connect the SERP Data Source
Configure the HTTP request that retrieves Google SERP data for each keyword.
- Add Primary SERP Request and set URL to `https://api.scrapingrobot.com`.
- Set Method to `POST` and Specify Body to `JSON`.
- Set JSON Body to `{ "url": "https://www.google.com", "module": "GoogleScraper", "params": { "query": "{{ $json.Keyword }}" } }`.
- Enable Send Body and Send Headers.
- Add a header parameter with Name `accept` and Value `application/json`.
- Credential Required: Connect your `httpQueryAuth` credentials (used by Primary SERP Request).
- Credential Required: Connect your `httpHeaderAuth` credentials (also configured on Primary SERP Request).
⚠️ Common Pitfall: If the SERP API returns errors, verify the authentication type is set to `genericCredentialType` with `httpQueryAuth` and that your header auth matches the provider’s requirements.
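If you want to sanity-check the call outside n8n, here is a minimal Node.js sketch of the same request. It assumes the API key travels as a `token` query parameter, which is what the `httpQueryAuth` credential supplies in this template; confirm the exact parameter name against your ScrapingRobot dashboard.

```javascript
// Minimal sketch of the Primary SERP Request outside n8n (Node.js 18+ with
// built-in fetch). Assumes the API key is passed as a `token` query parameter,
// mirroring the httpQueryAuth credential; verify against ScrapingRobot's docs.
const token = process.env.SCRAPINGROBOT_TOKEN; // placeholder for your API key

async function getSerp(keyword) {
  const response = await fetch(`https://api.scrapingrobot.com/?token=${token}`, {
    method: "POST",
    headers: { accept: "application/json", "content-type": "application/json" },
    body: JSON.stringify({
      url: "https://www.google.com",
      module: "GoogleScraper",
      params: { query: keyword },
    }),
  });
  if (!response.ok) throw new Error(`SERP request failed: ${response.status}`);
  return response.json(); // expect result.organicResults, result.searchQuery, etc.
}

// Quick single-keyword test, as suggested in the troubleshooting tips below:
getSerp("n8n automation").then((data) => console.log(data.result?.organicResults));
```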
Step 3: Define and Expand Keywords
Create the keyword array and split it into individual requests.
- In Define Keyword List, add an assignment with Name `Keyword`, Type `array`, and Value `["constant contact email automation", "business workflow software", "n8n automation"]`.
- Connect Database Placeholder Step to Define Keyword List.
- In Expand Keyword Items, set Field To Split Out to `Keyword`.
- Connect Define Keyword List to Expand Keyword Items, then to Primary SERP Request.
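For reference, here is what the keyword expansion does conceptually, written as an n8n Code node in Run Once for All Items mode. The template uses a Split Out node instead; this sketch just shows the transformation:

```javascript
// Conceptual equivalent of Expand Keyword Items as an n8n Code node
// (Run Once for All Items): one item holding a Keyword array becomes
// one item per keyword. The template uses a Split Out node for this.
const out = [];
for (const item of $input.all()) {
  for (const kw of item.json.Keyword) {
    out.push({ json: { Keyword: kw } });
  }
}
return out;
```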
Step 4: Map, Filter, and Enrich SERP Results
Transform the API response, split out organic results, and attach the query term for ranking.
- In Map SERP Output, set organicResults to `{{ $json.result.organicResults }}`.
- Set peopleAlsoAsk to `{{ $json.result.peopleAlsoAsk }}`, searchQuery to `{{ $json.result.searchQuery.term }}`, and paidAds to `{{ $json.result.paidResults }}`.
- Connect Primary SERP Request to Map SERP Output, then to Split Organic Results.
- In Split Organic Results, set Field To Split Out to `organicResults`.
- In Filter Nonempty Titles, configure the condition as String → notEmpty with Left Value `{{ $json.title }}`.
- In Attach Query Term, enable Include Other Fields and set searchQuery to `{{ $('Map SERP Output').item.json.searchQuery }}`.
- Connect Split Organic Results to Filter Nonempty Titles, then Filter Nonempty Titles to Attach Query Term.
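If you ever want to collapse these nodes into a single Code node, the combined logic looks roughly like this. It assumes each input item carries the ScrapingRobot response under `result`, as mapped above; the template keeps separate nodes, which is easier to debug:

```javascript
// Sketch of Map SERP Output + Split Organic Results + Filter Nonempty Titles +
// Attach Query Term as one n8n Code node (Run Once for All Items).
// Assumes each input item carries the ScrapingRobot response under `result`.
const rows = [];
for (const item of $input.all()) {
  const term = item.json.result?.searchQuery?.term;
  for (const r of item.json.result?.organicResults ?? []) {
    if (r.title) {
      // keep only organic results with a title, tagged with the query term
      rows.push({ json: { ...r, searchQuery: term } });
    }
  }
}
return rows;
```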
Step 5: Compute SERP Rank and Prepare Output
Assign ranking positions and send results to the output stub.
- In Compute SERP Rank, use the provided JavaScript to group by `searchQuery` and add `position` based on index; a sketch of that logic follows below.
- Connect Attach Query Term to Compute SERP Rank.
- Connect Compute SERP Rank to Database Output Stub to hold results for a future database write.
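The template ships its own ranking code; as a sketch, grouping by `searchQuery` and assigning `position` by order of appearance looks like this:

```javascript
// Sketch of the Compute SERP Rank logic (n8n Code node, Run Once for All
// Items): count items per searchQuery in order of appearance and assign
// that running count as the position. The template ships its own version.
const counters = {};
return $input.all().map((item) => {
  const query = item.json.searchQuery;
  counters[query] = (counters[query] || 0) + 1; // rank 1..N within each keyword
  return { json: { ...item.json, position: counters[query] } };
});
```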
⚠️ Common Pitfall: If ranking positions are off, ensure Attach Query Term includes the correct `searchQuery` value before it reaches Compute SERP Rank.
Step 6: Test and Activate Your Workflow
Run a manual test to confirm the SERP requests and ranking logic work end-to-end.
- Click Execute Workflow from Manual Execution Start to run the full sequence.
- Confirm Primary SERP Request returns data and Map SERP Output produces `organicResults` and `searchQuery`.
- Verify Compute SERP Rank outputs items with a numeric `position` field.
- Once verified, activate the workflow for production use and replace Database Output Stub with your actual data storage node.
Troubleshooting Tips
- ScrapingRobot credentials can expire or need specific permissions. If things break, check your ScrapingRobot dashboard token and the HTTP Request node’s “token” parameter first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- If you add an AI step (like the optional OpenAI labeling mentioned in Quick Answers), default prompts are generic. Add your brand voice early or you’ll be editing outputs forever.
Quick Answers
How long does setup take?
About 30 minutes if you already have your ScrapingRobot key and a target Google Sheet.
Do I need to know how to code?
No. You’ll paste an API key, choose your keyword source, and map the output to Google Sheets.
Is this free to run?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in ScrapingRobot API costs based on how many SERP pulls you run.
Should I use n8n Cloud or self-host?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I customize the workflow?
Yes, and you should. You can replace the “Define Keyword List” node with a Google Sheets read step, then keep the rest of the workflow the same. Many teams also adjust how many organic results they store (top 10 vs top 50), and add extra columns like “Domain” for faster competitor grouping. If you want summaries, you can plug an OpenAI step in after “Map SERP Output” to label intent or cluster keywords.
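As a sketch of that “Domain” idea, a small Code node after the ranking step could derive the column from each row’s url field (assuming your rows keep the `url` key from the organic results):

```javascript
// Hypothetical extra Code node: derive a Domain column from each row's url,
// assuming rows keep the `url` key from the organic results.
return $input.all().map((item) => {
  const domain = new URL(item.json.url).hostname.replace(/^www\./, "");
  return { json: { ...item.json, Domain: domain } };
});
```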
Why are my ScrapingRobot requests failing?
Usually it’s the token parameter. Regenerate your ScrapingRobot API key, update it inside the HTTP Request node, then re-run a single keyword to confirm the response structure. If it still fails, check for rate limits on your plan and confirm the query parameters match what ScrapingRobot expects for Google SERP requests.
How many keywords can this handle?
If your ScrapingRobot account supports it, the workflow can handle thousands of searches, and the template notes up to 5,000 searches on a free account. On n8n Cloud, your monthly execution limit depends on your plan, while self-hosting has no fixed execution cap (it’s mainly your server capacity and API rate limits). Practically, most teams start with a weekly run of a few hundred keywords, then scale once the sheet structure is stable.
Is n8n better than Zapier or Make for this?
Often, yes. n8n handles looping through keyword lists, splitting organic results into rows, and calculating rank positions without getting expensive or awkward. You also get a self-hosting option, which is handy when you run lots of scheduled checks. Zapier or Make can be fine for very small keyword sets, but multi-step SERP parsing gets clunky fast. Talk to an automation expert if you want a quick recommendation for your volume and reporting style.
Once this is running, rank tables stop being a weekly fire drill. You’ll spend your time on decisions, not data collection.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.