Reddit to X with Google Sheets: Keep Your Posts Consistent
Your content cadence shouldn’t die because you forgot to “check Reddit” on a busy day. But that’s what happens. You grab a promising post, rewrite it fast, post it, then later realize you already used the same idea last week.
This Reddit X automation hits marketers first (because consistency is the job), but founders and solo creators feel it too. You will turn rising Reddit posts into first-person X tweets, automatically, and log every post in Google Sheets so repeats stop happening.
Below is the exact workflow behavior, what it produces, and how to set it up without getting lost in tech jargon.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Reddit to X with Google Sheets, posts stay consistent
```mermaid
flowchart LR
subgraph sg0["Schedule Flow"]
direction LR
n0["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code1"]
n1@{ icon: "mdi:robot", form: "rounded", label: "Tweet maker1", pos: "b", h: 48 }
n2@{ icon: "mdi:database", form: "rounded", label: "read database2", pos: "b", h: 48 }
n3@{ icon: "mdi:brain", form: "rounded", label: "Google Gemini Chat Model1", pos: "b", h: 48 }
n4@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger1", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/x.dark.svg' width='40' height='40' /></div><br/>Creates the tweet1"]
n6@{ icon: "mdi:robot", form: "rounded", label: "Structured Output Parser2", pos: "b", h: 48 }
n7@{ icon: "mdi:cog", form: "rounded", label: "Get many posts in Reddit1", pos: "b", h: 48 }
n8@{ icon: "mdi:database", form: "rounded", label: "Append row in sheet1", pos: "b", h: 48 }
n9@{ icon: "mdi:swap-vertical", form: "rounded", label: "Edit Fields1", pos: "b", h: 48 }
n0 --> n1
n9 --> n5
n1 --> n9
n2 -.-> n1
n4 --> n0
n5 --> n8
n7 -.-> n1
n3 -.-> n1
n6 -.-> n1
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n4 trigger
class n1,n6 ai
class n3 aiModel
class n2,n8 database
class n0 code
classDef customIcon fill:none,stroke:none
class n0,n5 customIcon
```
The Problem: Staying consistent without reposting the same idea
Turning Reddit into X content sounds easy until you do it for two weeks straight. You have to pick a subreddit, scan what’s rising, open a handful of posts, decide what’s actually tweetable, then rewrite it so it sounds like you and not like a copied headline. The real killer is tracking. Without a simple log, you’ll reuse the same post ID or the same angle, and your feed starts feeling repetitive. Add in the pressure to post daily, and suddenly you are spending your best creative energy on busywork.
The friction compounds. Here’s where it breaks down in real life.
- You lose about 30 minutes per post just picking, rewriting, and formatting for X.
- Repeats happen because there’s no reliable “already used” check across weeks.
- Manual tracking in notes or bookmarks turns into a mess once you scale beyond a few subreddits.
- You post less often because the process requires your attention at the worst times.
The Solution: Automated Reddit-to-X posts with a no-repeat log
This workflow runs on a simple schedule and does the boring parts for you. Every couple of hours, it picks a subreddit from your preset list, pulls a rising Reddit post, and hands that content to an AI writing step (Gemini in the workflow data). The AI rewrites the idea into a short, punchy, first-person tweet so it reads like a human wrote it. Before anything gets posted, the workflow checks Google Sheets to confirm that Reddit post hasn’t been used already. If it’s new, it publishes to X via the X API and appends a clean record to your sheet (date, subreddit, post ID, and tweet text) so the workflow gets smarter over time.
The workflow starts on a schedule trigger (every 2 hours). A code step selects a subreddit, Reddit provides the rising content, and the AI agent composes the tweet with a structured output so formatting stays consistent. Then X publishes, and Google Sheets becomes your “memory” for duplicates and review.
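The no-repeat check is the heart of the "memory" idea. Here is a minimal sketch of that logic in plain JavaScript, assuming the sheet rows come back as objects keyed by column name; `alreadyPosted` and `loggedRows` are illustrative names, not nodes in the workflow (in the real flow, the AI agent runs this check through the Lookup Post History tool):

```javascript
// Rows as they might come back from the Google Sheet log
// (column names match the sheet described in this article).
const loggedRows = [
  { Date: '01/06/2024', subreddit: 'n8n', post_id: 'abc123', 'PAST TWEETS': '...' },
];

// Returns true when this Reddit post ID has already been tweeted,
// so the workflow can skip it instead of reposting.
function alreadyPosted(rows, postId) {
  return rows.some((row) => row.post_id === postId);
}
```

The point of logging the Reddit post ID (not just the tweet text) is that the ID is stable: even if you rephrase the same post differently, the duplicate check still catches it.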
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Picking a subreddit and pulling a rising Reddit post every 2 hours | A steady stream of fresh tweet ideas without manual scanning |
| Rewriting the post into a short, first-person tweet with AI | Posts that sound like you, not copied headlines |
| Checking Google Sheets before publishing | No accidental repeats across weeks |
| Publishing to X and logging date, subreddit, post ID, and tweet text | A complete audit trail and roughly 30 minutes saved per post |
Example: What This Looks Like
Say you want 3 posts per day on X, pulled from 5 subreddits. Manually, you might spend 10 minutes finding a good Reddit post and another 10 minutes rewriting it, so about an hour a day when you include context-switching and formatting. With this workflow, you set the schedule once and it runs every 2 hours. Your “time cost” becomes a quick sheet review, maybe 10 minutes a day, while the rest is handled automatically.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets for logging and duplicate checks
- X (formerly Twitter) account to publish tweets via the API
- Google Gemini API key (get it from Google AI Studio)
Skill level: Intermediate. You’ll connect accounts, add API keys, and match a few fields to your Google Sheet columns.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A schedule kicks everything off. The workflow triggers every 2 hours, so you don’t need to remember to run anything or babysit it.
A subreddit gets selected automatically. A small code step chooses from your preset array (for example: r/automation, r/n8n, r/SaaS), which keeps topics varied and avoids the “same vibe every day” problem.
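As a rough sketch of what that code step does, here is a self-contained selector in plain JavaScript. The subreddit list and the no-repeat rule are the parts you would edit; `lastUsed` is an illustrative stand-in for however the workflow tracks the previous pick, not an exact copy of the node's code:

```javascript
// Preset list, matching the examples in this article.
const subreddits = ['automation', 'n8n', 'SaaS'];

// Pick a random subreddit, avoiding the one used last time
// so consecutive runs vary in topic.
function pickSubreddit(list, lastUsed) {
  const candidates = list.filter((s) => s !== lastUsed);
  const pool = candidates.length > 0 ? candidates : list; // fall back if list has one entry
  return pool[Math.floor(Math.random() * pool.length)];
}

// In an n8n Code node, you would return this as workflow data, e.g.:
// return [{ json: { subreddit: pickSubreddit(subreddits, lastUsed) } }];
```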
Reddit content is fetched and rewritten into a tweet. The Reddit tool pulls a rising post, then the Gemini chat model plus the AI agent convert that idea into a short first-person post. A structured output parser keeps things predictable (tweet text, subreddit, post ID), which makes logging and filtering reliable.
X publishes and Google Sheets remembers. The workflow maps fields, publishes to X, and appends a row to your sheet with the date, subreddit, Reddit post ID, and tweet text so you can audit what went out.
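The appended row can be sketched as a plain object. This is an illustration of the shape described above, with the `dd/MM/yyyy` date reproduced in plain JavaScript (in n8n itself the Date cell comes from the `$now.format('dd/MM/yyyy')` expression); `buildLogRow` is a hypothetical helper name:

```javascript
// Format a Date as dd/MM/yyyy, matching the sheet's Date column.
function formatDate(d) {
  const pad = (n) => String(n).padStart(2, '0');
  return `${pad(d.getDate())}/${pad(d.getMonth() + 1)}/${d.getFullYear()}`;
}

// One row of the audit log: date, subreddit, Reddit post ID, tweet text.
function buildLogRow(subreddit, postId, tweet, now = new Date()) {
  return {
    Date: formatDate(now),
    subreddit,
    post_id: postId,
    'PAST TWEETS': tweet,
  };
}
```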
You can easily modify the subreddit list to match your niche. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Schedule Trigger
This workflow runs automatically on a schedule to kick off the Reddit selection and tweet composition chain.
- Add or open Scheduled Automation Start and set the schedule rule to run every 2 hours (Field: `hours`, `hoursInterval = 2`).
- Confirm the execution flow starts as Scheduled Automation Start → Select Subreddit Logic.
- Optionally keep Flowpast Branding as a reference note; it has no execution impact.
Step 2: Connect Reddit Data Retrieval
Trending posts are pulled from Reddit as an AI tool so the agent can choose what to write about.
- Open Retrieve Reddit Posts and set Operation to `getAll` and Limit to `10`.
- Set Subreddit to `={{ $fromAI('subreddit', 'name of the subreddit', 'string') }}`.
- Set the filter category to `rising` in Filters.
- Credential Required: Connect your redditOAuth2Api credentials. This tool is used by Compose Tweet Agent, so ensure the agent has access to the tool connection.
Step 3: Connect Google Sheets
Google Sheets is used to prevent duplicate Reddit post usage and to log the final tweets.
- Open Lookup Post History and set Document ID to `[YOUR_ID]` and Sheet Name to `gid=0` (sheet `posts`).
- In Lookup Post History, set filters to: lookupColumn `subreddit` with lookupValue `={{ $fromAI('subreddit', 'subreddit', 'string') }}`, and lookupColumn `post_id` with lookupValue `={{ $fromAI('id', 'id of the post', 'string') }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials. This tool is used by Compose Tweet Agent, so ensure the agent has access to the tool connection.
- Open Log Tweet Record and set Operation to `append`, Document ID to `[YOUR_ID]`, and Sheet Name to `gid=0`.
- Map columns in Log Tweet Record: Date `={{ $now.format('dd/MM/yyyy') }}`, post_id `={{ $('Map Tweet Fields').item.json.post_id }}`, subreddit `={{ $('Map Tweet Fields').item.json.subreddit }}`, PAST TWEETS `={{ $('Map Tweet Fields').item.json.tweet }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials.
Replace `[YOUR_ID]` with your actual Google Sheet ID, or the lookup and logging steps will fail.
Step 4: Set Up the Processing and AI Nodes
This step selects a subreddit or promo mode and generates a structured tweet using the AI agent and parser.
- Open Select Subreddit Logic and keep the JavaScript as provided to randomly choose between `advertise` and a subreddit while avoiding repeats.
- Open Compose Tweet Agent and set Text to `={{ $json.tweet }}`.
- In Compose Tweet Agent, set Prompt Type to `define` and keep System Message as provided to enforce the tweet style and tool usage rules.
- Open Structured Result Parser and set Schema Type to `manual` with the JSON schema that requires `tweet` and optionally `subreddit` and `id`.
- Connect Gemini Chat Engine as the language model for Compose Tweet Agent. Credential Required: Connect your googlePalmApi credentials in Gemini Chat Engine.
- Confirm tool connections: Retrieve Reddit Posts and Lookup Post History are connected as AI tools, and Structured Result Parser is connected as the output parser for Compose Tweet Agent.
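The manual schema for Structured Result Parser might look something like the following. This is a sketch of a schema requiring `tweet` with optional `subreddit` and `id`, not a verbatim copy of the workflow's schema; the descriptions are illustrative:

```json
{
  "type": "object",
  "properties": {
    "tweet": { "type": "string", "description": "Final first-person tweet text" },
    "subreddit": { "type": "string", "description": "Subreddit the idea came from" },
    "id": { "type": "string", "description": "Reddit post ID, used for duplicate logging" }
  },
  "required": ["tweet"]
}
```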
Step 5: Configure Publish Tweet Output
The agent’s structured output is mapped into tweet fields and then posted to Twitter.
- Open Map Tweet Fields and set assignments: tweet `={{ $json.output.tweet }}`, subreddit `={{ $json.output.subreddit || null }}`, post_id `={{ $json.output.id || null }}`.
- Open Publish Tweet and set Text to `={{ $json.tweet }}`.
- Set Additional Fields → Attachments to `={{ $json.image_id || null }}` if you plan to include media.
- Credential Required: Connect your twitterOAuth2Api credentials.
Step 6: Test and Activate Your Workflow
Verify that the full chain runs end-to-end and posts a tweet while logging to Google Sheets.
- Click Execute Workflow to run a manual test from Scheduled Automation Start.
- Confirm a structured output is produced by Compose Tweet Agent, then mapped by Map Tweet Fields.
- Check Twitter to ensure Publish Tweet successfully posted the text.
- Verify the row append in Log Tweet Record with the correct Date, post_id, subreddit, and PAST TWEETS values.
- Enable the workflow by toggling the Active switch for production use.
Common Gotchas
- Google Sheets credentials can expire or need specific permissions. If things break, check the n8n credential entry for Google Sheets and confirm the connected Google account still has access to the target spreadsheet.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- X (Twitter) API access is picky about permissions and app status. If publishing suddenly fails, check your X developer dashboard for revoked tokens, missing write access, or an account-level restriction.
Frequently Asked Questions
How long does setup take?
About an hour if you already have the API access sorted.
Do I need to know how to code?
No. You’ll mostly connect accounts and paste API keys. The only “code” you might touch is the subreddit list, and it’s a simple array you can copy and edit.
Is there a free way to run this?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Gemini API usage and any X API costs tied to your developer plan.
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I customize the subreddits and tweet style?
Yes, and you should. Update the subreddit array in the “Select Subreddit Logic” code node, then tighten the writing rules inside the Gemini Chat Engine / Compose Tweet Agent prompt so it matches your tone (calm, edgy, technical, whatever). You can also adjust the structured output to include extras like “hook type” or “CTA style,” then log those fields in Google Sheets for later analysis.
Why did my tweet fail to publish?
Usually it’s expired or revoked OAuth credentials in n8n, so reconnect your X account and try again. If that doesn’t fix it, check your X developer app permissions to confirm it still has write access. Rate limits can also bite if you crank the schedule too aggressively. Frankly, X changes policies more than most tools, so this is the first place I look.
How many executions can this workflow handle?
If you self-host, there’s no execution limit (it depends on your server). On n8n Cloud, your monthly execution cap depends on plan, but this workflow is lightweight and typically handles a tweet every couple of hours without stress. The practical limit is usually your X API plan and how strict you want to be about duplicate filtering in Google Sheets.
Is n8n a better fit than Zapier or Make for this?
Often, yes. This workflow benefits from logic like “check the sheet, then post, then log,” plus structured AI output, which is easier to control in n8n. Self-hosting is also a big deal if you want lots of runs without paying per task. Zapier and Make can still be fine for a simple “new row → post tweet” flow, but they get awkward when you add AI prompting, branching, and dedupe rules. If you’re on the fence, Talk to an automation expert and describe your posting goals.
Once this is running, your “post consistently” plan stops depending on motivation and free time. The workflow handles the repeatable parts, and your Google Sheet keeps you honest.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.