Google Sheets + Claude AI, SEO briefs ready to edit
Your SEO brief process probably isn’t “hard.” It’s just endless. Copy a keyword, Google it, open ten tabs, skim headings, paste notes into a doc, then try to turn all that into a meta title, description, H1, and a usable outline without missing something obvious.
This SEO briefs automation hits content strategists and SEO leads first. But agency owners juggling multiple clients and marketing managers shipping pages every week feel the drag too. The outcome is simple: briefs show up inside your Google Sheet, ready for a writer to pick up and run with.
Below you’ll see how the workflow pulls competitor structure, blends it with your client context, and generates consistent meta + H1 + outline output at scale.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Google Sheets + Claude AI, SEO briefs ready to edit
flowchart LR
subgraph sg0["When chat message received Flow"]
direction LR
n0@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Filter1", pos: "b", h: 48 }
n1@{ icon: "mdi:swap-horizontal", form: "rounded", label: "If1", pos: "b", h: 48 }
n2@{ icon: "mdi:brain", form: "rounded", label: "Anthropic Chat Model2", pos: "b", h: 48 }
n3@{ icon: "mdi:cube-outline", form: "rounded", label: "Supabase Vector Store2", pos: "b", h: 48 }
n4@{ icon: "mdi:vector-polygon", form: "rounded", label: "Embeddings OpenAI2", pos: "b", h: 48 }
n5@{ icon: "mdi:robot", form: "rounded", label: "Structured Output Parser2", pos: "b", h: 48 }
n6@{ icon: "mdi:brain", form: "rounded", label: "Anthropic Chat Model", pos: "b", h: 48 }
n7@{ icon: "mdi:cube-outline", form: "rounded", label: "Supabase Vector Store", pos: "b", h: 48 }
n8@{ icon: "mdi:vector-polygon", form: "rounded", label: "Embeddings OpenAI", pos: "b", h: 48 }
n9["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Titre 1"]
n10["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Titre 2"]
n11["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Titre 3"]
n12["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Titre 4"]
n13["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Titre 5"]
n14["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code"]
n15["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Apify"]
n16@{ icon: "mdi:database", form: "rounded", label: "Update row in sheet", pos: "b", h: 48 }
n17@{ icon: "mdi:location-exit", form: "rounded", label: "Scrape 1", pos: "b", h: 48 }
n18@{ icon: "mdi:location-exit", form: "rounded", label: "Scrape 2", pos: "b", h: 48 }
n19@{ icon: "mdi:location-exit", form: "rounded", label: "Scrape 5", pos: "b", h: 48 }
n20@{ icon: "mdi:location-exit", form: "rounded", label: "Scrape 4", pos: "b", h: 48 }
n21@{ icon: "mdi:location-exit", form: "rounded", label: "Scrape 3", pos: "b", h: 48 }
n22@{ icon: "mdi:database", form: "rounded", label: "Client Information", pos: "b", h: 48 }
n23@{ icon: "mdi:database", form: "rounded", label: "SEO information", pos: "b", h: 48 }
n24@{ icon: "mdi:play-circle", form: "rounded", label: "When chat message received", pos: "b", h: 48 }
n25@{ icon: "mdi:swap-vertical", form: "rounded", label: "Loop Over Items", pos: "b", h: 48 }
n26@{ icon: "mdi:robot", form: "rounded", label: "Meta tag + h1", pos: "b", h: 48 }
n27@{ icon: "mdi:robot", form: "rounded", label: "Content brief", pos: "b", h: 48 }
n1 --> n25
n14 --> n16
n15 --> n17
n0 --> n1
n9 --> n18
n10 --> n21
n11 --> n20
n12 --> n19
n13 --> n26
n17 --> n9
n18 --> n10
n21 --> n11
n20 --> n12
n19 --> n13
n27 --> n14
n26 --> n27
n25 --> n15
n23 --> n0
n8 -.-> n7
n22 --> n23
n4 -.-> n3
n16 --> n25
n6 -.-> n27
n2 -.-> n26
n7 -.-> n27
n3 -.-> n26
n5 -.-> n26
n24 --> n22
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n24 trigger
class n5,n26,n27 ai
class n2,n6 aiModel
class n3,n7 ai
class n4,n8 ai
class n0,n1 decision
class n16,n22,n23 database
class n15 api
class n9,n10,n11,n12,n13,n14 code
classDef customIcon fill:none,stroke:none
class n9,n10,n11,n12,n13,n14,n15 customIcon
The Problem: SEO briefs are slow, inconsistent, and hard to scale
Writing a good brief takes research. And research takes time in the most annoying way: lots of small steps, lots of context switching, lots of “wait, where did I paste that?” moments. You start with a keyword list in Google Sheets, but the real work happens across search results, competitor pages, and whatever internal notes you can find. Then someone has to translate that mess into meta tags, an H1, intent guidance, and a clean outline that a writer can follow. Do it once and it’s fine. Do it for 50 keywords and it becomes your entire week.
It adds up fast. Here’s where it breaks down.
- Competitor research gets rushed, so briefs end up generic and you publish pages that don’t match what’s already ranking.
- Meta titles, descriptions, and H1s vary wildly by who wrote them, which means more rewrites and more approvals.
- Handing work to freelancers is painful because the “why” behind the outline rarely makes it into the doc.
- Scaling to multiple clients becomes a copy-paste operation, and mistakes slip in when you’re moving fast.
The Solution: Turn keyword rows into edit-ready briefs inside Google Sheets
This n8n workflow starts with a simple input: the URL of your template Google Sheet, which already contains your client context and your keyword list. From there, it reads the “Client Information” tab (business description, tone of voice, and constraints) and the “SEO” tab (page type, keyword, awareness level, and more). It filters for rows that actually need work, then processes keywords in batches so the workflow stays reliable and doesn’t overload your APIs. For each keyword, it grabs real search results, crawls competitor pages, extracts their heading structure, and feeds that competitive context into Claude to generate meta tags, an H1, and a detailed content brief. Finally, it writes everything back to the same Google Sheet, so the handoff is immediate.
The workflow kicks off when you send your Sheet URL to an n8n chat webhook. Then it runs SERP research (via an API), scrapes up to five competitor pages, and uses AI agents to draft meta elements and an outline that matches intent. Once it’s ready, the workflow updates each row in Google Sheets and moves on to the next keyword.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| SERP research and crawling of up to five competitor pages per keyword | Briefs grounded in what's actually ranking, not guesswork |
| Claude-generated meta titles, descriptions, H1s, and outlines | Consistent, on-brand metadata across writers and clients |
| Writing results back into the matching Google Sheet rows | Edit-ready briefs your writers can pick up immediately |
Example: What This Looks Like
Say you have 30 keywords to brief for a new service category. Manually, you might spend about 10 minutes collecting SERP notes and headings per competitor, and you’ll usually check five competitors, so that’s roughly 50 minutes before you even draft the meta title, description, H1, and outline (call it another 10 minutes). That’s about an hour per keyword, or 30 hours total. With this workflow, you paste the Sheet URL once and let it run in batches; many teams see each row filled in after roughly 10–20 minutes of processing time, so you’re spending your effort reviewing and tweaking instead of building from scratch.
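The arithmetic above, spelled out. The numbers are the same illustrative estimates from the example, not measurements:

```javascript
// Back-of-the-envelope math for manual briefing (illustrative estimates only).
const keywords = 30;
const competitorsPerKeyword = 5;
const minutesPerCompetitor = 10; // SERP notes + heading review per competitor
const minutesToDraft = 10;       // meta title, description, H1, outline

const manualMinutesPerKeyword =
  competitorsPerKeyword * minutesPerCompetitor + minutesToDraft; // 60 minutes
const manualHoursTotal = (keywords * manualMinutesPerKeyword) / 60; // 30 hours

console.log(`Manual: ~${manualHoursTotal} hours for ${keywords} keywords`);
```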
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets for your keyword list and outputs
- Anthropic (Claude) to generate meta tags and briefs
- OpenAI API key (get it from the OpenAI dashboard)
- Apify API key (get it from Apify console for SERPs)
- Firecrawl API key (get it from Firecrawl for scraping)
- Supabase credentials (optional) (get it from Supabase project settings)
Skill level: Intermediate. You don’t need to code, but you should be comfortable adding API keys, testing one keyword row, and adjusting prompts if you want a specific house style.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
You send the workflow a Google Sheet URL. A chat webhook trigger receives the link, then n8n reads your template tabs (client info and SEO rows) using Google Sheets.
Rows get filtered and queued in batches. It skips empty keywords and avoids overwriting rows that already have an H1, then “Split in Batches” keeps execution stable when you’re processing a big list.
Competitor research runs automatically. n8n pulls SERP results through an HTTP request, crawls up to five competitor URLs with Firecrawl, and extracts the heading structure with small code steps so the AI has real inputs.
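To make those "small code steps" concrete, here is a minimal sketch of what a heading-extraction step might do, assuming the crawl node returns Firecrawl-style markdown. Field names and the exact logic in your workflow may differ:

```javascript
// Hypothetical heading extractor for Firecrawl markdown output.
// Keeps H1–H3 lines and returns them with their level, so the AI agent
// receives a clean competitor outline instead of a full page dump.
function extractHeadings(markdown) {
  return markdown
    .split('\n')
    .filter((line) => /^#{1,3}\s/.test(line))
    .map((line) => {
      const level = line.match(/^#+/)[0].length;
      return { level, text: line.replace(/^#+\s*/, '').trim() };
    });
}

const sample = '# Guide\n\nIntro text\n\n## Pricing\n### Plans';
console.log(extractHeadings(sample));
// [{level:1,text:'Guide'},{level:2,text:'Pricing'},{level:3,text:'Plans'}]
```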
Claude generates meta + H1 + brief using your context. An AI Agent produces a meta title, meta description, and H1, then a second agent composes the full content brief. If you connect Supabase, the workflow can retrieve your approved client details to stay grounded.
The sheet updates itself. A final assembly step maps the AI output into the right columns and updates the matching row in Google Sheets, then the batch loop continues.
You can easily modify the number of competitors scraped to match your niche. You can also change how strict the character limits are if your CMS has different rules. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Webhook Trigger
Start the workflow when a chat webhook receives a Google Sheets URL to process.
- Add the Chat Webhook Trigger node and set Mode to `webhook`.
- Enable Public to `true` so the webhook can receive external requests.
- Confirm that the incoming payload includes a `chatInput` field containing the Google Sheets URL used by downstream nodes.
- Downstream nodes reference the URL as `{{$json.chatInput}}`; test with a real sheet URL to avoid empty document references.

Step 2: Connect Google Sheets for Client and SEO Data
Load client details and the SEO sheet from the URL passed by the webhook.
- In Fetch Client Details, set Sheet Name to `Client information` and Document ID to `={{ $('Chat Webhook Trigger').item.json.chatInput }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials in Fetch Client Details.
- In Retrieve SEO Sheet, set Sheet Name to `SEO` and Document ID to `={{ $('Chat Webhook Trigger').item.json.chatInput }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials in Retrieve SEO Sheet.
- Verify the execution flow: Chat Webhook Trigger → Fetch Client Details → Retrieve SEO Sheet.
Step 3: Filter and Batch Keywords for Processing
Filter eligible keyword rows and iterate through them in batches for SERP analysis.
- In Filter Keyword Rows, keep the conditions that check `={{ $json['Keyword'] }}` is not empty and `={{ $json['<h1>'] }}` is empty.
- In Validate Keyword Presence, ensure the condition checks that `={{ $json['Keyword'] }}` exists before continuing.
- Use Batch Through Keywords to iterate the filtered rows; this will later loop back from Update Spreadsheet Row to process the next keyword.
- Confirm the flow: Retrieve SEO Sheet → Filter Keyword Rows → Validate Keyword Presence → Batch Through Keywords.
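In plain JavaScript, the filtering logic amounts to something like the sketch below. The actual workflow uses n8n Filter and If nodes rather than code, and the column names are taken from the template sheet:

```javascript
// Illustrative version of the Step 3 filter: keep rows that have a keyword
// but no H1 yet, so already-briefed rows are never overwritten.
const rows = [
  { Keyword: 'seo briefs', '<h1>': '' },        // needs work
  { Keyword: 'content audit', '<h1>': 'Done' }, // already briefed, skip
  { Keyword: '', '<h1>': '' },                  // empty keyword, skip
];

const eligible = rows.filter(
  (row) => row.Keyword && row.Keyword.trim() !== '' && !row['<h1>']
);

console.log(eligible.length); // 1
```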
Step 4: Configure SERP Search and Competitor Crawling
Search the SERP and crawl competitor pages, then extract headings for AI analysis.
- In Search SERP via API, set URL to `https://api.apify.com/v2/acts/nFJndFXA5zjCTuudP/run-sync-get-dataset-items` and keep Method as `POST`.
- Set JSON Body to the provided payload and keep the query line `"queries": "{{ $json['Keyword'] }}"` intact.
- Credential Required: Connect your httpHeaderAuth credentials in Search SERP via API.
- Connect Firecrawl credentials to all five crawl nodes: Crawl Result One, Crawl Result Two, Crawl Result Three, Crawl Result Four, and Crawl Result Five.
- Confirm each crawl node uses the proper URL expression, e.g., Crawl Result One → URL `={{ $('Search SERP via API').item.json.organicResults[0].url }}`.
- Keep the sequential extraction chain intact: Crawl Result One → Extract Headings One → Crawl Result Two → Extract Headings Two → Crawl Result Three → Extract Headings Three → Crawl Result Four → Extract Headings Four → Crawl Result Five → Extract Headings Five.
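For orientation, here is a sketch of the request the HTTP node sends. The endpoint is the one given above; the extra body fields (`resultsPerPage`, `maxPagesPerQuery`) are typical for Apify Google Search actors and are an assumption, so check them against your actor's input schema:

```javascript
// Hedged sketch of the SERP request. Only the URL, method, and "queries"
// line are taken from the workflow; other body fields are illustrative.
const keyword = 'seo content brief template';

const request = {
  url: 'https://api.apify.com/v2/acts/nFJndFXA5zjCTuudP/run-sync-get-dataset-items',
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    queries: keyword,       // mirrors "queries": "{{ $json['Keyword'] }}"
    resultsPerPage: 10,     // assumption: typical actor input field
    maxPagesPerQuery: 1,    // assumption: typical actor input field
  }),
};

console.log(JSON.parse(request.body).queries);
```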
Step 5: Set Up AI Generation and Parsing
Generate SEO metadata and content briefs using Anthropic models, Supabase retrieval, and OpenAI embeddings.
- In Generate Meta & H1, keep the full prompt text and confirm the expressions referencing keywords, client details, and competitor headings are intact.
- Attach Anthropic Model Primary as the language model for Generate Meta & H1. Credential Required: Connect your anthropicApi credentials in Anthropic Model Primary.
- Attach Supabase Retriever Primary as a tool to Generate Meta & H1 and keep Tool Name as `BaseDeDonnees` with Table Name `={{ $('Fetch Client Details').item.json['Supabase database'] }}_documents`. Credential Required: Connect your supabaseApi credentials in Supabase Retriever Primary.
- OpenAI Embeddings Primary supplies embeddings to Supabase Retriever Primary. Credential Required: Connect your openAiApi credentials in OpenAI Embeddings Primary. Add credentials to the parent nodes, not the sub-node connection.
- Attach Structured JSON Parser as the output parser for Generate Meta & H1, and keep the schema example as provided. Add parser settings to the parent node, not the parser sub-node.
- In Compose Content Brief, keep the prompt text that references `{{$('Generate Meta & H1').item.json.output.h1}}` and competitor headings from all extraction nodes.
- Attach Anthropic Model Secondary as the language model for Compose Content Brief. Credential Required: Connect your anthropicApi credentials in Anthropic Model Secondary.
- Attach Supabase Retriever Secondary with the same table expression `={{ $('Fetch Client Details').item.json['Supabase database'] }}_documents` and connect OpenAI Embeddings Secondary for embeddings. Credential Required: Connect your supabaseApi and openAiApi credentials in those nodes.
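The Structured JSON Parser forces the model's reply into a predictable shape. The field names below come from the testing step later in this guide; the example values and the length limits in the checker are illustrative (standard SEO guidance, not taken from the workflow's schema):

```javascript
// Illustrative shape of the parsed "Generate Meta & H1" output.
const exampleOutput = {
  meta_title: 'SEO Brief Automation: Google Sheets + Claude',
  meta_description:
    'Turn keyword rows into edit-ready SEO briefs with an automated n8n workflow.',
  h1: 'SEO Briefs, Generated Straight Into Google Sheets',
};

// A small sanity check a reviewer might run on parsed output.
// The 60/160 character caps are common SEO defaults, not workflow settings.
function isValidMetaOutput(out) {
  return (
    typeof out.meta_title === 'string' && out.meta_title.length <= 60 &&
    typeof out.meta_description === 'string' && out.meta_description.length <= 160 &&
    typeof out.h1 === 'string' && out.h1.length > 0
  );
}

console.log(isValidMetaOutput(exampleOutput)); // true
```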
Step 6: Assemble and Update the SEO Sheet
Merge AI outputs with original rows and update the Google Sheet.
- In Assemble Sheet Update, keep the code that maps AI outputs to `<title>`, `<meta-desc>`, `<h1>`, and `brief`.
- In Update Spreadsheet Row, set Operation to `update` and Sheet Name to `FR`.
- Set Document ID to `={{ $('Chat Webhook Trigger').item.json.chatInput }}`.
- Ensure Matching Columns includes `mots clés` (keywords) for the update match.
- Credential Required: Connect your googleSheetsOAuth2Api credentials in Update Spreadsheet Row.
- Confirm the loop: Assemble Sheet Update → Update Spreadsheet Row → Batch Through Keywords to process the next row.
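A plain-JavaScript sketch of the mapping this step performs. The column names come from the workflow; the function name and input shapes are illustrative, not the actual code in the node:

```javascript
// Hypothetical version of "Assemble Sheet Update": merge AI output into the
// original row so the update keeps the matching key ("mots clés") intact.
function assembleRowUpdate(metaOutput, brief, originalRow) {
  return {
    ...originalRow,
    '<title>': metaOutput.meta_title,
    '<meta-desc>': metaOutput.meta_description,
    '<h1>': metaOutput.h1,
    brief,
  };
}

const updated = assembleRowUpdate(
  { meta_title: 'T', meta_description: 'D', h1: 'H' },
  'Outline: intro, pricing, FAQ',
  { 'mots clés': 'seo briefs' }
);
console.log(updated['<h1>']); // "H"
```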
Step 7: Test & Activate Your Workflow
Validate end-to-end execution and enable the workflow for production use.
- Click Execute Workflow and send a test request to Chat Webhook Trigger with a valid Google Sheets URL in `chatInput`.
- Verify that Search SERP via API returns `organicResults` and that all Crawl Result and Extract Headings nodes produce heading outputs.
- Confirm Generate Meta & H1 outputs structured fields matching `meta_title`, `meta_description`, and `h1` from Structured JSON Parser.
- Check the Google Sheet to ensure `<title>`, `<meta-desc>`, `<h1>`, and `brief` are updated in the `FR` sheet.
- Once verified, toggle the workflow to Active to run in production.
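One way to send that test request from Node 18+ (which ships with `fetch`). The webhook URL below is a placeholder; copy the real one from your Chat Webhook Trigger node:

```javascript
// Placeholder URL: replace with the webhook URL shown in your trigger node.
const webhookUrl = 'https://your-n8n-host/webhook/your-webhook-path';

// Build the JSON payload the trigger expects: a "chatInput" field holding
// the Google Sheets URL.
function buildTestRequest(sheetUrl) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ chatInput: sheetUrl }),
  };
}

// Uncomment to actually send:
// fetch(webhookUrl, buildTestRequest('https://docs.google.com/spreadsheets/d/your-sheet-id'))
//   .then((res) => console.log('Webhook responded with', res.status));
```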
Common Gotchas
- Google Sheets permissions are the quiet killer here. If the workflow can’t read or write, check the Google OAuth connection in n8n and confirm the shared Sheet is accessible to that Google account.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Firecrawl scraping can fail on aggressive sites or heavy JavaScript pages. When a crawl comes back empty, check the Firecrawl node output and consider lowering the number of competitor pages for that keyword.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
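The Firecrawl gotcha above can also be softened with a small guard step that drops empty crawl results instead of failing the whole keyword. This is a hypothetical addition, not part of the template; in n8n it would sit between a Crawl node and its Extract Headings node:

```javascript
// Hypothetical guard: keep only crawl results that actually contain markdown,
// so one blocked competitor page doesn't break the batch.
function usableCrawls(crawlResults) {
  return crawlResults.filter(
    (r) => r && typeof r.markdown === 'string' && r.markdown.trim().length > 0
  );
}

const results = [
  { markdown: '# Competitor page' },
  { markdown: '' }, // blocked or JS-heavy page came back empty
  null,             // request failed outright
];
console.log(usableCrawls(results).length); // 1
```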
Frequently Asked Questions
**How long does setup take?** About 45 minutes if you already have your API keys and the template sheet ready.
**Do I need to know how to code?** No. You’ll connect accounts, paste in API keys, and test a single keyword row first.
**Is it free to run?** Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in API usage for Anthropic (Claude), Firecrawl, Apify, and OpenAI embeddings.
**Should I use n8n Cloud or self-host?** Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
**Can I customize the briefs?** Yes, and it’s worth doing. You can tweak the prompts in the “Generate Meta & H1” and “Compose Content Brief” AI Agent nodes to match your house style, deliverables, and strictness. If your team wants different columns, adjust the mapping in the “Assemble Sheet Update” code step before “Update Spreadsheet Row.” Common changes include adding FAQ suggestions, forcing a specific outline template, and tightening character limits for your CMS.
**Why isn’t my Google Sheet updating?** Usually it’s the wrong Google account or missing access to the Sheet. Reconnect Google Sheets OAuth in n8n, then open the template and confirm it’s shared with that same account. If the workflow can read but not write, check that the spreadsheet isn’t protected and that the target tab names match what the nodes expect. Also watch for changed column headers because the final update step maps into specific columns.
**How many keywords can I process?** A lot, but the limiter is usually API cost and execution volume. On n8n Cloud, your monthly execution cap depends on your plan; self-hosting removes that cap, but you’re still bounded by server resources and third-party rate limits. Practically, teams often run 50–200 keywords in batches without babysitting, then schedule the rest overnight. If you crank up the competitor count or scrape heavy pages, it slows down and costs more.
**Do I really need n8n, or would Zapier/Make work?** For this kind of workflow, usually yes. You’re doing looping, batching, conditional filtering, multi-step scraping, and AI generation with structured parsing, which is where no-code “glue tools” can get expensive or brittle. n8n also gives you the option to self-host, which matters when you’re generating briefs in volume. Zapier or Make can still work for a lightweight version (keyword in, brief out), but competitor crawling and richer logic is frankly easier to maintain in n8n. If you want help deciding, Talk to an automation expert.
Once this is running, briefs stop being a bottleneck and start being a pipeline. You review, you polish, you ship.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.