January 22, 2026

Reddit + Google Sheets: SEO drafts ready to edit

Lisa Granqvist Partner Workflow Automation Expert

Content ideas are everywhere. But turning them into publishable drafts usually means a lot of tab-hopping, copy-paste, and telling yourself “I’ll write it later.”

If you’re a marketing lead trying to keep a content calendar full, this Reddit SEO drafts automation is a relief. Agency owners feel it when clients want “more SEO content” without paying for more hours. And if you run a small business, you just want a backlog that doesn’t rely on your mood.

This workflow pulls real questions from Reddit and drops clean, structured blog drafts into Google Sheets. You’ll learn what it does, what you need, and how to get it running without becoming an automation engineer.

How This Automation Works

Here’s the complete workflow you’ll be setting up:

n8n Workflow Template: Reddit + Google Sheets: SEO drafts ready to edit

Why This Matters: Turning “Good Questions” Into Drafts Takes Forever

You already know Reddit is a goldmine for content. People ask blunt, specific questions, and they use the same phrasing your customers type into Google. The problem is what happens next. You skim posts, save a few links, maybe dump them into a doc… and then nothing ships. Or you finally sit down to write and spend the first hour just turning a messy question into a usable title, outline, and intro. It’s not hard work. It’s draining work. And it steals time from the part that actually matters: editing, adding experience, and publishing consistently.

It adds up fast. Here’s where the workflow earns its keep.

  • Finding questions is easy, but organizing them into a real backlog is the part that quietly breaks your publishing rhythm.
  • When the same question gets rewritten from scratch each time, your titles, slugs, and outlines end up inconsistent across your site.
  • Manual drafting invites small errors, like mismatched headings or vague intros, which means more editing later.
  • “We’ll write from Reddit later” usually turns into “we forgot where we saved that thread.”

What You’ll Build: Reddit Questions to Structured Google Sheets Drafts

This workflow starts when you choose to run it (a manual trigger, so you stay in control). It pulls the latest posts from a subreddit you pick, then filters down to posts that look like real questions based on common “question words” (think how, what, why). Those question-style posts get logged into a Google Sheet as raw input, so you have a clean source list. Then AI steps in: it rephrases the question into a clearer topic, generates a proper blog title and URL-friendly slug, and writes a structured draft in sections. Finally, the workflow saves the title, slug, intro, steps, and conclusion into a second Google Sheet that’s ready for editing and publishing.

In practice, you get two Sheets working together. One becomes your “question inbox.” The other becomes your “draft library,” where each row is a draft you can assign, edit, and ship.
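To make the two-sheet model concrete, here is a sketch of the row shapes in plain JavaScript. The column names match the mapping steps later in this guide; the values are placeholders, not real data.

```javascript
// Row shape for the "question inbox" sheet (Queries).
const queryRow = {
  Title: 'How do I schedule workflows in n8n?', // original Reddit post title
  Query: 'Full post body text goes here',       // the post's selftext
};

// Row shape for the "draft library" sheet (Blog).
const blogRow = {
  name: 'How Do I Schedule Workflows in n8n?', // AI-rephrased topic
  slug: 'how-do-i-schedule-workflows-in-n8n',  // URL-friendly slug
  Intro: 'AI-written intro section',
  Steps: 'AI-written step-by-step section',
  Conclusion: 'AI-written conclusion',
};
```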


Expected Results

Let’s say you want 10 fresh SEO topics a week from one subreddit. Manually, you might spend about 30 minutes per topic scanning, selecting a question, pasting it into a doc, and writing a rough outline, so that’s about 5 hours before “real writing” even begins. With this workflow, you can run it in about 2 minutes, then let AI generate the structured sections in the background. You still edit before publishing, but you’re starting from a draft library, not an empty page.
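The back-of-the-envelope math works out like this (the per-topic and per-run minutes are the estimates above, not measurements):

```javascript
const topicsPerWeek = 10;
const manualMinutesPerTopic = 30;  // scanning, selecting, pasting, outlining
const workflowMinutesPerRun = 2;   // one manual trigger covers the whole batch

const manualTotalMinutes = topicsPerWeek * manualMinutesPerTopic; // 300 min = 5 h
const savedHours = (manualTotalMinutes - workflowMinutesPerRun) / 60;

console.log(`Manual: ${manualTotalMinutes} min, automated: ${workflowMinutesPerRun} min`);
console.log(`Saved per week: about ${savedHours.toFixed(1)} hours`);
```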

Before You Start

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Reddit API access to pull posts from a subreddit.
  • OpenAI API to generate titles, sections, and slugs.
  • Google Sheets to store questions and finished drafts.

Skill level: Beginner. You’ll connect accounts, paste API keys, and tweak a couple of prompts and sheet column names.

Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).

Step by Step

You trigger a run when you want new drafts. This workflow uses a manual launch, which is great if you want to review what’s coming in from Reddit and avoid daily “set-and-forget” noise.

Reddit posts are pulled and filtered into real questions. n8n retrieves the latest subreddit posts, then a filter checks titles for question phrasing. Only the likely questions make it into your “query” Google Sheet.

AI cleans up the topic and writes the structured sections. The workflow loops through each saved question, rephrases it into a cleaner blog topic, generates a URL slug, then drafts an intro, step-by-step guide section, and conclusion using OpenAI.

The finished draft lands in a separate Google Sheet. Each row becomes an editable draft record (title, slug, intro, steps, conclusion) so you can assign it, rewrite parts, and publish using your normal process.

You can easily modify the subreddit source to target a different audience based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

Start the workflow manually to pull fresh Reddit questions on demand.

  1. Add the Manual Launch Trigger node as the workflow entry point.
  2. Connect Manual Launch Trigger to Retrieve Reddit Posts to start the chain.

Step 2: Connect Reddit as the Data Source

Pull new posts from the n8n subreddit for question extraction.

  1. Open Retrieve Reddit Posts and set Operation to getAll.
  2. Set Subreddit to n8n and Limit to 30.
  3. Credential Required: Connect your redditOAuth2Api credentials.
  4. Connect Retrieve Reddit Posts to Filter Question Titles.
⚠️ Common Pitfall: If Retrieve Reddit Posts returns empty data, confirm the OAuth scope allows read access and the subreddit name is correct.
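If you want the workflow to fail loudly instead of passing empty data downstream, one option is a small Code node right after Retrieve Reddit Posts. This is a hypothetical addition, not part of the original template; in n8n you would call it as `return assertPostsReturned($input.all());`, and here it is shown as a plain function:

```javascript
// Hypothetical guard for a Code node placed after Retrieve Reddit Posts.
// Throws a descriptive error when Reddit returns nothing, instead of letting
// the filter and AI steps run on empty input.
function assertPostsReturned(items) {
  if (!Array.isArray(items) || items.length === 0) {
    throw new Error(
      'Retrieve Reddit Posts returned no items. ' +
      'Check that your Reddit OAuth credential has read scope ' +
      'and that the subreddit name is spelled correctly.'
    );
  }
  return items;
}
```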

Step 3: Filter Questions and Log to the Query Sheet

Identify question-style titles, then save them for downstream processing.

  1. In Filter Question Titles, set JavaScript Code to:

     const questionWords = ['who', 'what', 'when', 'where', 'why', 'how', 'can', 'does', 'is', 'should', 'do', 'are', 'could', 'would'];

     return $input.all().filter(item => {
       const rawTitle = item.json.title;
       if (!rawTitle) return false;

       const title = rawTitle.trim().toLowerCase();

       const isQuestion =
         title.endsWith('?') ||
         questionWords.some(word =>
           title.startsWith(word + ' ') ||
           title.includes(' ' + word + ' ')
         );

       // Optional debug log:
       if (isQuestion) console.log('✅ Question found:', title);

       return isQuestion;
     });

  2. Open Append Query Sheet and set Operation to append.
  3. Map Query to {{ $json.selftext }} and Title to {{ $json.title }}.
  4. Set Document to [YOUR_ID] and Sheet to Queries.
  5. Credential Required: Connect your googleSheetsOAuth2Api credentials.
  6. Connect Append Query Sheet to Batch Iterator.
Use Batch Iterator to process each question one at a time and avoid token spikes in the AI steps.
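To sanity-check the filter before wiring it into the node, you can run the same predicate as a standalone function. This mirrors the Code node logic above; the sample titles are made up:

```javascript
const questionWords = ['who', 'what', 'when', 'where', 'why', 'how', 'can',
  'does', 'is', 'should', 'do', 'are', 'could', 'would'];

// Same logic as Filter Question Titles, extracted for local testing.
function isQuestionTitle(rawTitle) {
  if (!rawTitle) return false;
  const title = rawTitle.trim().toLowerCase();
  return (
    title.endsWith('?') ||
    questionWords.some(word =>
      title.startsWith(word + ' ') ||
      title.includes(' ' + word + ' ')
    )
  );
}

console.log(isQuestionTitle('How do I backup n8n workflows')); // → true
console.log(isQuestionTitle('Sharing my automation setup'));   // → false
```

Note that matching bare question words mid-title can produce false positives (e.g. any title containing “ is ”), so expect to prune a few rows in the Queries sheet.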

Step 4: Set Up AI Processing and Rephrasing

Rephrase the selected question and prepare it for content generation.

  1. Connect Batch Iterator to Rephrase Question Agent (main output).
  2. In Rephrase Question Agent, set Text to =Here is a question: {{ $json.Title }} Rephrase it without changing the meaning. Keep it as a question. I just need the question as the output nothing else.
  3. Ensure Primary Chat Model is connected as the language model for Rephrase Question Agent and select model gpt-4o-mini.
  4. Credential Required: Connect your openAiApi credentials on Primary Chat Model (not on Rephrase Question Agent).
  5. Attach Context Memory Buffer as AI memory and set Session Key to {{ $json.Title }}.
  6. Connect Rephrase Question Agent to Map Question Name and map name to {{ $json.output }}.
Memory nodes like Context Memory Buffer do not take credentials directly—add credentials to the parent chat model nodes.

Step 5: Configure the Blog Sheet Upsert and Parallel AI Drafting

Store the rephrased title and generate slug, intro, steps, and conclusion in parallel.

  1. In Upsert Blog Sheet, set Operation to appendOrUpdate.
  2. Map name to {{ $json.name }}, set Document to [YOUR_ID], and Sheet to Blog.
  3. Credential Required: Connect your googleSheetsOAuth2Api credentials.
  4. Connect Upsert Blog Sheet to Draft Intro Section, Draft Steps Section, Draft Conclusion, and Generate URL Slug so all four run in parallel.
  5. Configure AI prompts and memory for each draft node:
    Generate URL Slug Text: =Based on {{ $json.name }} Create a website slug for it. For example: best-website-builder Just give me the slug as output nothing else with Slug Chat Model and Slug Memory Buffer (Session Key {{ $json.name }}).
    Draft Intro Section Text: =Write a short intro for a blog post titled: {{ $json.name }} Make it easy to read, with easy vocabulary Just give me the intro as the output with Intro Chat Model and Intro Memory Buffer (Session Key {{ $('Upsert Blog Sheet').item.json.name }}).
    Draft Steps Section Text: =Write a 'step by step guide' section for a blog post titled: {{ $json.name }} Make it easy to read, with easy vocabulary But make it very detailed Just give me the output with Steps Chat Model and Steps Memory Buffer (Session Key {{ $('Upsert Blog Sheet').item.json.name }}).
    Draft Conclusion Text: =Write a short conclusion for a blog post titled: {{ $json.name }} Make it easy to read, with easy vocabulary Just give me the conclusion as the output with Conclusion Chat Model and Conclusion Memory Buffer (Session Key {{ $('Upsert Blog Sheet').item.json.name }}).
  6. Credential Required: Connect your openAiApi credentials on Slug Chat Model, Intro Chat Model, Steps Chat Model, and Conclusion Chat Model.
Parallel execution speeds up content generation, but each branch must map its output to a unique field before combining.
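If you’d rather not spend an AI call on slugs, a deterministic fallback is straightforward. This is a sketch of a possible replacement Code node for Generate URL Slug, not part of the original template:

```javascript
// Deterministic slug generation — a hypothetical alternative to the
// Generate URL Slug AI node, for stable and repeatable slugs.
function slugify(title) {
  return title
    .toLowerCase()
    .normalize('NFKD')                // split accented chars into base + mark
    .replace(/[\u0300-\u036f]/g, '')  // strip the combining marks
    .replace(/[^a-z0-9\s-]/g, '')     // drop punctuation
    .trim()
    .replace(/[\s-]+/g, '-');         // collapse whitespace/hyphens into '-'
}

console.log(slugify("What's the Best Website Builder?"));
// → whats-the-best-website-builder
```

A deterministic slug also means re-running the workflow on the same question can’t produce two different URLs.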

Step 6: Map AI Outputs and Append the Blog Record

Assign each generated section to a field, then write the full record to the blog sheet.

  1. In Assign Slug Field, map slug to {{ $json.output }}.
  2. In Assign Intro Field, map Intro to {{ $json.output }}.
  3. In Assign Steps Field, map Steps to {{ $json.output }}.
  4. In Assign Conclusion Field, map Conclusion to {{ $json.output }}.
  5. Each of Assign Slug Field, Assign Intro Field, Assign Steps Field, and Assign Conclusion Field should connect to Append Blog Output.
  6. In Append Blog Output, set Operation to append and map: name to {{ $json.name }}, slug to {{ $json.slug }}, Intro to {{ $json.Intro }}, Steps to {{ $json.Steps }}, and Conclusion to {{ $json.Conclusion }}.
  7. Credential Required: Connect your googleSheetsOAuth2Api credentials.
  8. Connect Append Blog Output back to Batch Iterator to continue batch processing.
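Conceptually, the four Assign nodes just attach each AI output to a named field so Append Blog Output can write one complete row. A plain-JavaScript sketch of that merge (field names match the mapping above; the values are placeholders):

```javascript
// Builds the row shape that Append Blog Output writes to the Blog sheet.
function buildBlogRow({ name, slug, intro, steps, conclusion }) {
  return { name, slug, Intro: intro, Steps: steps, Conclusion: conclusion };
}

const row = buildBlogRow({
  name: 'How Do I Backup n8n Workflows?',
  slug: 'how-do-i-backup-n8n-workflows',
  intro: 'placeholder intro',
  steps: 'placeholder steps',
  conclusion: 'placeholder conclusion',
});
```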

Step 7: Test and Activate Your Workflow

Run a manual test to confirm records and AI content are written to your sheets.

  1. Click Execute Workflow and trigger Manual Launch Trigger.
  2. Verify that Append Query Sheet adds rows to the Queries sheet and that Upsert Blog Sheet and Append Blog Output populate the Blog sheet.
  3. Confirm that the slug, intro, steps, and conclusion fields contain AI-generated text for each question.
  4. When everything looks correct, toggle the workflow to Active for production use.

Troubleshooting Tips

  • Reddit credentials can expire or need specific permissions. If things break, check your Reddit app settings in the Reddit developer portal first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.

Quick Answers

What’s the setup time for this Reddit SEO drafts automation?

About 20 minutes if your accounts are ready.

Is coding required for this SEO drafts automation?

No. You’ll mostly connect credentials and adjust a couple prompts and sheet columns.

Is n8n free to use for this Reddit SEO drafts workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API costs (often a few cents per draft, depending on length).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I modify this Reddit SEO drafts workflow for different use cases?

Yes, and you should. Most people swap the subreddit in the Reddit retrieval step, then adjust the AI prompts used for the intro, steps, and conclusion so the draft matches their niche and tone. You can also change the “question filter” logic to catch different phrasing (for example, “best way to” or “vs”). If you publish in multiple languages, add a translation step (for example, a Google Translate node) and route translated drafts into a separate sheet tab.

Why is my Reddit connection failing in this workflow?

Usually it’s an OAuth issue on the Reddit app side. Double-check the redirect URL, then reconnect the Reddit credential in n8n so the token refreshes. If it works for a few runs and then dies, it can be missing scopes or the credential was created under a different Reddit account than you expected.

What volume can this Reddit SEO drafts workflow process?

If you keep batches reasonable (like 10–30 questions per run), it’s smooth on most setups.
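The Batch Iterator is doing roughly what this chunking helper does: splitting the question list into small groups so each AI call stays within token and rate limits. The batch size here is illustrative:

```javascript
// Split an array into batches of `size` — roughly what n8n's
// Loop Over Items (Split In Batches) node does under the hood.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const questions = Array.from({ length: 30 }, (_, i) => `question ${i + 1}`);
console.log(chunk(questions, 10).length); // → 3
```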

Is this Reddit SEO drafts automation better than using Zapier or Make?

Often, yes, because this workflow isn’t just moving data between apps. You’re filtering, looping through items in batches, and generating multiple content sections with AI, which is where simpler tools can get awkward or expensive. n8n also gives you more control over branching logic, so you can route “good questions” to one place and “maybe later” questions to another. If you self-host, you’re not paying per task in the same way, which matters when you’re generating drafts in bulk. That said, Zapier or Make can be quicker for very small, two-step flows. Talk to an automation expert if you want help picking the right tool.

Once this is running, your content backlog stops being a “someday” project and starts behaving like an actual system. You edit, polish, publish, repeat.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
