January 22, 2026

DataForSEO + Google Search Console, client-ready audits

Lisa Granqvist, Partner & Workflow Automation Expert

You know the drill. A “quick” content audit turns into five tabs open, exports that don’t match, and a report you end up rewriting at midnight because the numbers feel off.

This is the pain SEO consultants deal with constantly. It also hits agency leads trying to standardize deliverables, and in-house marketers who need a reliable GSC audit automation without building a spreadsheet monster.

This workflow combines a DataForSEO crawl with Google Search Console metrics, then outputs a branded HTML audit you can send to a client. You’ll see what it automates, what results to expect, and what you need to run it.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: DataForSEO + Google Search Console, client-ready audits

The Problem: SEO audits take too long to turn into client-ready reports

A content audit is supposed to create clarity. Instead, it often creates more work. You crawl a site in one tool, pull performance from Google Search Console in another, then spend hours trying to line up URLs, normalize data, and explain what matters in a way a client will actually understand. And if you’re doing this monthly, the busywork becomes a permanent tax on your calendar. Worse, manual audits drift. You change thresholds, skip checks when you’re rushed, and the “method” depends on who did the audit that week.

It adds up fast. Here’s where it usually breaks down.

  • Pulling crawl data and GSC metrics separately leads to mismatched URLs and confusing conclusions.
  • Manual issue spotting (thin content, duplicate metadata, orphan pages) is slow, and you end up sampling instead of auditing everything.
  • Turning findings into a polished, branded deliverable often takes as long as the analysis itself.
  • Scaling past a few hundred pages becomes a slog, so bigger sites get “good enough” audits.

The Solution: one automated crawl + GSC merge, delivered as a branded HTML audit

This n8n workflow runs a crawl in DataForSEO, waits until the task is complete, and pulls the raw audit data back into your automation. From there, it extracts the URLs you care about (like status 200 pages), then batches through those URLs to fetch clicks and impressions from Google Search Console. Once both datasets are in the same place, the workflow merges crawl findings with performance signals, builds a structured report model, and renders a branded HTML report with summaries, issue breakdowns, and prioritized recommendations. Finally, it exports the report as a downloadable HTML file, so you can share it as-is or drop it into your delivery process.

The workflow starts with you setting the domain, crawl limit (up to 1,000 pages), and branding fields. DataForSEO runs the crawl while n8n periodically checks status. When it’s done, the workflow enriches the crawl with GSC data, then produces a client-friendly HTML report that doesn’t need “one more formatting pass.”

What You Get: Automation vs. Results

Example: What This Looks Like

Say you run a monthly audit for a client site with about 500 indexable pages. Manually, you might spend about 2 hours pulling a crawl export, another hour cleaning URL formats, and about 2 more hours pulling GSC data, merging it, and packaging it into something client-ready. With this workflow, you set the domain and branding once, launch it, and wait about 20 minutes for the crawl/report run to finish. The output is a branded HTML audit you can send the same day, without rebuilding the report every time.

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • DataForSEO for website crawling and audit data
  • Google Search Console to pull clicks and impressions per URL
  • DataForSEO API credentials (get them from the DataForSEO API Access page)

Skill level: Intermediate. You’ll connect API credentials, edit a few fields, and run batches safely.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

You launch the workflow and set the inputs. In the “Assign Input Values” step, you choose the target domain, max crawl pages (up to 1,000), JavaScript rendering preference, and your branding details like logo URL and colors.

The workflow runs the crawl and checks progress. n8n creates the DataForSEO task, then loops with a short wait and a status check until the crawl is complete. No babysitting.
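
Conceptually, the create-then-poll pattern behaves like the sketch below. This is not literal node code; the workflow wires it up with HTTP Request, Wait, and IF nodes, and DFS_AUTH stands in for your base64-encoded DataForSEO login:password pair.

    // Conceptual sketch of the polling loop, not literal node code.
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    async function waitForCrawl(taskId) {
      let progress;
      do {
        await sleep(60_000); // mirrors the 1-minute "Pause Interval" node
        const res = await fetch(
          `https://api.dataforseo.com/v3/on_page/summary/${taskId}`,
          { headers: { Authorization: `Basic ${DFS_AUTH}` } }
        );
        const summary = await res.json();
        progress = summary.tasks[0].result[0].crawl_progress;
      } while (progress !== 'finished'); // the "Branch Completion Check" condition
    }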

It enriches crawl URLs with Google Search Console performance. URLs are parsed, processed in batches, and each batch gets GSC clicks and impressions via an HTTP request to the Search Console API. If an API call needs a moment, the workflow pauses and retries.

Your audit becomes a branded HTML report. The workflow merges datasets, identifies issue categories (status problems, metadata, content quality, internal linking, performance), then renders an HTML report and exports it as a downloadable file.

You can easily modify crawl limits and issue thresholds based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

Start the workflow with the manual trigger so you can run the audit on demand during setup and testing.

  1. Add the Manual Launch Trigger node as the starting point.
  2. Keep default settings (no parameters required) to allow manual execution from the editor.

Step 2: Connect Data Inputs and Branding

Define the crawl target, company details, and branding used throughout the audit and HTML report.

  1. Open Assign Input Values and set dfs_domain to example.com.
  2. Set dfs_max_crawl_pages to 1000 and dfs_enable_javascript to false.
  3. Fill in the branding fields: set company_name to Example Company, company_website to https://example.com, and company_logo_url to https://example.com/logo.png.
  4. Set brand_primary_color to #252946 and brand_secondary_color to #0fd393.
  5. Set gsc_property_type to domain to use domain-level Search Console data.
You can change gsc_property_type to url-prefix if you want property-specific metrics instead of domain-wide data.
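
Collected in one place, the inputs from this step look like the sketch below (same example values as above; the field names match the workflow's Set node):

    // The "Assign Input Values" fields, using the example values from this step.
    const inputs = {
      dfs_domain: 'example.com',
      dfs_max_crawl_pages: 1000,
      dfs_enable_javascript: false,
      company_name: 'Example Company',
      company_website: 'https://example.com',
      company_logo_url: 'https://example.com/logo.png',
      brand_primary_color: '#252946',
      brand_secondary_color: '#0fd393',
      gsc_property_type: 'domain', // or 'url-prefix' for property-specific metrics
    };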

Step 3: Configure the DataForSEO Crawl and Polling Loop

These nodes submit a crawl task, check progress, and fetch the raw audit data once the crawl is complete.

  1. In Initiate Crawl Task, keep URL as https://api.dataforseo.com/v3/on_page/task_post and set JSON Body to the provided expression (shown expanded after this step): =[{"target":"{{ $json.dfs_domain }}","max_crawl_pages": {{ $json.dfs_max_crawl_pages }},"load_resources": false,"enable_javascript": {{ $json.dfs_enable_javascript }},"custom_js":"meta = {}; meta.url = document.URL; meta;","tag":"{{ $json.dfs_domain + Math.floor(10000 + Math.random() * 90000) }}"}].
  2. Credential Required: Connect your httpBasicAuth credentials in Initiate Crawl Task, Verify Task Progress, Retrieve Raw Audit Data, and Retrieve Link Sources.
  3. In Verify Task Progress, set URL to =https://api.dataforseo.com/v3/on_page/summary/{{ $json.tasks[0].id }} and keep Content-Type header as application/json.
  4. In Branch Completion Check, set the condition to check {{ $json.tasks[0].result[0].crawl_progress }} equals finished.
  5. Set Pause Interval to wait 1 minute before polling progress again if the crawl is not finished.
  6. Configure Retrieve Raw Audit Data with URL https://api.dataforseo.com/v3/on_page/pages and JSON Body =[{"id":"{{ $json.tasks[0].id }}","limit":"1000"}].
⚠️ Common Pitfall: The DataForSEO nodes require Basic Auth credentials even though the nodes show no saved credentials by default. Add your credentials before testing or the crawl will fail.
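
For readability, here is the JSON Body from step 1 expanded, with the Step 2 example inputs substituted:

    // The "Initiate Crawl Task" body, expanded. The tag appends a random
    // five-digit number to the domain so each run is traceable.
    const body = [{
      target: 'example.com',
      max_crawl_pages: 1000,
      load_resources: false,
      enable_javascript: false,
      // custom_js captures each page's URL alongside the crawl data
      custom_js: 'meta = {}; meta.url = document.URL; meta;',
      tag: 'example.com' + Math.floor(10000 + Math.random() * 90000),
    }];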

Step 4: Enrich URLs with Search Console Metrics

This section extracts 200-status URLs, batches them, and pulls 90-day Search Console metrics for each page.

  1. In Parse Page URLs, keep the code that filters only status_code === 200 and outputs { url: page.url } (a sketch follows after this list).
  2. Set Batch Iterate URLs to batchSize 100 for scalable processing.
  3. In Fetch GSC Metrics, keep the dynamic URL expression that switches between domain and URL-prefix properties: {{ $('Assign Input Values').first().json.gsc_property_type === 'domain' ? 'https://searchconsole.googleapis.com/webmasters/v3/sites/' + 'sc-domain:' + $node["Batch Iterate URLs"].json.url.replace(/https?:\/\/(www\.)?([^\/]+).*/, '$2') + '/searchAnalytics/query' : 'https://searchconsole.googleapis.com/webmasters/v3/sites/' + encodeURIComponent($node["Batch Iterate URLs"].json.url.replace(/(https?:\/\/(?:www\.)?[^\/]+).*/, '$1')) + '/searchAnalytics/query' }}.
  4. Keep the Body expression that sets a 90-day range and filters by the page URL: {"startDate":"{{ new Date(new Date().setDate(new Date().getDate() - 90)).toISOString().split('T')[0] }}","endDate":"{{ new Date().toISOString().split('T')[0] }}","dimensionFilterGroups":[{"filters":[{"dimension":"page","operator":"equals","expression":"{{ $node['Batch Iterate URLs'].json.url }}"}]}],"aggregationType":"auto","rowLimit":100}.
  5. Credential Required: Connect your googleOAuth2Api credentials in Fetch GSC Metrics.
  6. Set Pause Retry to wait 1 minute before mapping results if the Google API throttles.
  7. In Map GSC Metrics to URL, set URL to {{ $('Batch Iterate URLs').item.json.url }}, Clicks to {{ $('Fetch GSC Metrics').item.json.rows[0].clicks }}, and Impressions to {{ $('Fetch GSC Metrics').item.json.rows[0].impressions }}.
  8. Batch Iterate URLs outputs to both Combine GSC with Audit and Fetch GSC Metrics in parallel to enrich the audit while the batch loop runs.
If Search Console returns empty rows, ensure the property matches gsc_property_type and the URL format is consistent with the property.
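
The Parse Page URLs code referenced in step 1 amounts to a short filter. A sketch, assuming the DataForSEO pages response nests pages under tasks[0].result[0].items (verify that path against your node output):

    // Sketch of the "Parse Page URLs" Code node: keep only pages that returned
    // HTTP 200 and emit one n8n item per URL for "Batch Iterate URLs".
    const pages = $input.first().json.tasks[0].result[0].items ?? [];

    return pages
      .filter((page) => page.status_code === 200)
      .map((page) => ({ json: { url: page.url } }));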

Step 5: Build the Report Model and HTML Output

These nodes merge audit data with GSC metrics, gather link sources for 404/301 pages, and render the branded HTML report.

  1. In Combine GSC with Audit, keep the JavaScript that merges GSC metrics into each page’s googleSearchConsoleData (see the sketch after this step).
  2. Use Filter 404 and 301 to isolate error and redirect URLs from Retrieve Raw Audit Data.
  3. Batch Link Sources outputs to both Assemble Report Model and Retrieve Link Sources in parallel to enrich 404/301 pages with source links.
  4. In Retrieve Link Sources, keep URL as https://api.dataforseo.com/v3/on_page/links and JSON Body as =[{"id":"{{ $('Retrieve Raw Audit Data').first().json.tasks[0].id }}","page_to":"{{ $json.url }}"}].
  5. In Format Link Source Data, keep the mapping of link_from, type, and text into a structured sources array.
  6. In Assemble Report Model, keep the long-form JavaScript that categorizes issues and builds the summary, issues, and pages output.
  7. In Render HTML Report, keep the HTML template logic referencing Assign Input Values and Assemble Report Model.
  8. In Export HTML File, set Operation to toText, Source Property to html, and Binary Property Name to content audit report.
⚠️ Common Pitfall: If Retrieve Link Sources doesn’t have valid DataForSEO credentials, 404/301 source link sections will be empty in the HTML report.
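
For orientation, a minimal sketch of the merge in Combine GSC with Audit, assuming the field names set in step 4 of the previous section (URL, Clicks, Impressions); the workflow’s actual code carries more fields:

    // Sketch of "Combine GSC with Audit": index GSC results by URL, then
    // attach clicks/impressions to each crawled page.
    const metricsByUrl = {};
    for (const item of $('Map GSC Metrics to URL').all()) {
      metricsByUrl[item.json.URL] = item.json;
    }

    return $input.all().map((item) => {
      const page = item.json;
      const metrics = metricsByUrl[page.url] ?? {};
      page.googleSearchConsoleData = {
        clicks: metrics.Clicks ?? 0,
        impressions: metrics.Impressions ?? 0,
      };
      return { json: page };
    });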

Step 6: Test and Activate Your Workflow

Run a manual test to validate data collection and verify that the HTML report is generated correctly.

  1. Click Execute Workflow starting from Manual Launch Trigger to run the audit end-to-end.
  2. Confirm that Retrieve Raw Audit Data returns crawl items and Fetch GSC Metrics returns rows for each URL.
  3. Verify Render HTML Report outputs a full HTML string and Export HTML File generates a downloadable file.
  4. When results look correct, toggle the workflow to Active to use it in production runs.

Common Gotchas

  • DataForSEO credentials can expire or need specific permissions. If things break, check your DataForSEO API access status and the n8n credential assignment on the crawl-related HTTP Request nodes first.
  • If you’re using Wait nodes or external processing, timing varies. Bump up the wait duration if downstream nodes fail on empty responses while the crawl task is still finishing.

Frequently Asked Questions

How long does it take to set up this GSC audit automation?

About 30 minutes if you already have DataForSEO and Search Console access ready.

Do I need coding skills to automate GSC audit reporting?

No. You will connect credentials and edit a few fields in n8n. The workflow’s logic is already built.

Is n8n free to use for this GSC audit automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in DataForSEO API usage (they include a small test credit) and any Google API limits on your account.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this GSC audit automation workflow for white-label branding and different audit thresholds?

Yes, and you should. Update branding fields (company name, logo URL, primary/secondary colors) in the “Assign Input Values” step, then adjust issue thresholds in the report-building logic, such as thin content word count (currently 1500), click depth flags (currently deeper than 4), and readability (currently below 55). If you want a different title/description policy, modify the title and description length checks as well. Many teams also add an extra delivery step after the “Export HTML File” output to send the report by email or upload it to storage.
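
If it helps to see them side by side, the defaults above map to values like these in the report-building code (the names here are illustrative, not the workflow’s actual variable names; search Assemble Report Model for the numbers):

    // Illustrative names for the default thresholds mentioned above.
    const THRESHOLDS = {
      thinContentWords: 1500, // pages under this word count flag as thin content
      maxClickDepth: 4,       // pages deeper than this flag a click-depth issue
      minReadability: 55,     // readability scores below this are flagged
    };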

Why is my DataForSEO connection failing in this workflow?

Usually it’s incorrect Basic Auth credentials or the credential isn’t assigned to every DataForSEO HTTP Request node. Double-check the “Initiate Crawl Task” and “Retrieve Raw Audit Data” nodes first, then confirm your DataForSEO API access is active. If it still fails, it can be rate limiting on larger runs, so lowering batch pressure and extending waits can help.
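
To rule out the credentials themselves, you can test Basic Auth against the same endpoint outside n8n, for example with this Node.js sketch (replace LOGIN and PASSWORD with your DataForSEO credentials):

    // Quick Basic Auth sanity check against the endpoint the workflow uses.
    // A 401 response means bad credentials; anything else means auth works
    // and the problem is elsewhere (node assignment, rate limits).
    const auth = Buffer.from('LOGIN:PASSWORD').toString('base64');

    const res = await fetch('https://api.dataforseo.com/v3/on_page/task_post', {
      method: 'POST',
      headers: {
        Authorization: `Basic ${auth}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify([]), // an empty task list is enough to exercise auth
    });

    console.log(res.status, await res.json());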

How many pages can this GSC audit automation handle?

Up to 1,000 crawled pages per run, and it batches the GSC lookups so the enrichment stays manageable.

Is this GSC audit automation better than using Zapier or Make?

Often, yes. This workflow relies on looping, batching, waiting for long-running crawl tasks, and merging datasets reliably, which is where n8n tends to be more flexible (and cheaper to run at volume if you self-host). Zapier or Make can work for simple “send me a report” flows, but this type of audit build usually gets awkward once you add batching and retries. Also, generating a full branded HTML report is easier when you control the logic end-to-end. If you’re torn, talk to an automation expert and you’ll get a straight recommendation based on your volume and delivery needs.

Set it up once and your audits stop being a recurring fire drill. The workflow handles the repeatable work so you can focus on decisions, not data-wrangling.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.

Use template: Get instant access to this n8n workflow JSON file.
