YouTube to Airtable, podcast scripts ready to review
You post a YouTube video, and then the real grind starts. Finding the transcript, cleaning it up, removing filler, and turning it into something you’d actually record as a podcast script. It’s not hard work. It’s just endless work.
This YouTube Airtable automation hits podcast creators first, honestly. But content marketers repurposing long-form and video producers supporting a whole pipeline feel it too. The payoff is simple: new uploads turn into review-ready podcast script drafts without the copy-paste marathon.
Below, you’ll see exactly what the workflow does, what it saves you, and how to customize it so every script lands in Airtable in the format your team prefers.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: YouTube to Airtable, podcast scripts ready to review
```mermaid
flowchart LR
    subgraph sg0["Watch for New YouTube Video via RSS Flow"]
        direction LR
        n0(["Watch for New YouTube Video via RSS"])
        n1["Get Transcript from YouTube Video"]
        n2(["Transform Transcript into Podcast Script"])
        n3["Save Podcast Script and Metadata to Airtable"]
        n0 --> n1
        n1 --> n2
        n2 --> n3
    end
    %% Styling
    classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
    classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
    classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
    classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
    class n0 trigger
    class n2 ai
    class n3 database
    class n1 api
```
The Problem: Turning YouTube transcripts into podcast scripts is a time sink
If you repurpose YouTube into an audio show, you already know the annoying part isn’t recording. It’s everything before that. You grab a transcript, discover it’s messy, then spend time deleting filler words, fixing weird line breaks, and trying to make speakers sound like actual humans. After that, you still need a title and summary that match your podcast style, plus a place to store everything so your editor isn’t digging through Google Docs, emails, and random notes.
It adds up fast. And once it’s part of your weekly publishing rhythm, the friction compounds.
- Transcripts are rarely “record-ready,” so you end up doing an hour or more of cleanup per episode.
- When files live in scattered docs, someone always reviews an old version by accident.
- Summaries and titles get rushed at the end, which means weaker episode positioning.
- Manual steps invite small errors, like missing segments or mixing speaker lines.
The Solution: Auto-generate clean podcast scripts from every new upload
This workflow watches a YouTube channel feed for new uploads. As soon as a new video is published, it grabs the transcript using Dumpling AI, then sends that raw text into GPT-4o with instructions to turn it into something you can actually work with. The AI cleans up the transcript (less filler, better readability), organizes dialogue with speaker labels, and produces a podcast-friendly title and a concise summary. Finally, it stores the structured output in Airtable, so scripts are queued up and ready for review, editing, and handoff.
The workflow starts with a YouTube RSS trigger. Then Dumpling AI pulls the transcript from the video URL. GPT-4o transforms it into structured script content, and Airtable becomes the single place your team reviews and publishes from.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Watching your channel for new uploads via RSS | Every new video enters the pipeline automatically |
| Fetching the transcript through Dumpling AI | No digging through YouTube Studio or third-party transcript sites |
| Cleaning filler and structuring dialogue with GPT-4o | A review-ready script instead of an hour-plus of manual cleanup |
| Generating a podcast-ready title and summary | Stronger episode positioning, never rushed at the end |
| Saving everything to Airtable | One shared place for review, editing, and handoff |
Example: What This Looks Like
Say you publish 3 YouTube videos a week and turn each one into a podcast episode. Manually, grabbing the transcript (about 10 minutes), cleaning and structuring it (about 90 minutes), and writing a title and summary (about 20 minutes) puts you around 2 hours per episode, or roughly 6 hours a week. With this workflow, the “work” becomes checking Airtable and editing a draft, which is often 15–30 minutes per episode. That’s most of a workday back every week.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Airtable for storing scripts and review status
- Dumpling AI to fetch the YouTube transcript
- OpenAI API key (get it from the OpenAI API dashboard)
Skill level: Beginner. You’ll paste in a few credentials, confirm field names in Airtable, and test with one recent video.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A new YouTube upload triggers everything. n8n monitors your channel’s RSS feed, so the workflow kicks off the moment a new video appears.
The transcript gets pulled from the video URL. An HTTP request calls Dumpling AI’s transcript endpoint and retrieves the full text, which means you don’t touch YouTube Studio or third-party transcript sites.
GPT-4o turns raw transcript into a usable script. The workflow sends the transcript to GPT-4o and asks for structured JSON: a cleaned transcript, speaker labels, a podcast-ready title, and a concise summary.
Airtable becomes your review inbox. n8n writes the outputs into an Airtable table so you can assign an editor, add a status field, and keep publishing organized.
You can easily modify the prompt to match your show format, then change Airtable fields to fit your pipeline. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the RSS Trigger
Set up the RSS trigger to monitor your YouTube feed and start the workflow when a new video is published.
- Add and configure RSS New Video Monitor.
- Set Feed URL to `https://rss.app/feeds/Vw076Uzh7bIinpci.xml`.
- Confirm the polling schedule is set to every minute in Poll Times.
- Leave Flowpast Branding as-is; it is a sticky note for documentation and does not affect execution.
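If you’re curious what the trigger actually does, it’s ordinary RSS polling: fetch the feed, diff against items you’ve already seen. A minimal Python sketch of that logic (the GUID cache and the feed shape are illustrative; n8n handles all of this internally):

```python
import xml.etree.ElementTree as ET

def new_feed_items(feed_xml, seen_guids):
    """Parse RSS XML and return items whose GUID hasn't been seen yet.

    In practice the XML would come from fetching the feed URL on a
    schedule; `seen_guids` stands in for n8n's internal dedupe state.
    """
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):
        # Fall back to the link when a feed omits <guid>
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            seen_guids.add(guid)
            items.append({
                "title": item.findtext("title"),
                "link": item.findtext("link"),
            })
    return items
```

Each new item’s `link` is what flows downstream as `{{ $json.link }}` for the transcript step.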
Step 2: Connect DumplingAI Transcript Retrieval
Configure the HTTP request to fetch a transcript for each new YouTube video.
- Open Fetch YouTube Transcript and set URL to `https://app.dumplingai.com/api/v1/get-youtube-transcript`.
- Set Method to `POST` and enable Send Body with Body Content Type set to `json`.
- Set JSON Body to `{ "videoUrl": "{{ $json.link }}", "preferredLanguage": "en" }`.
- Credential Required: Connect your httpHeaderAuth credentials for the DumplingAI API.
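The HTTP Request node boils down to a single POST. Here’s a rough Python sketch of the equivalent request, useful for sanity-checking your Dumpling AI key outside n8n; note that the `Authorization` header name and scheme shown are an assumption, since the actual header is whatever you configure in the httpHeaderAuth credential:

```python
import json

def build_transcript_request(video_url, api_key, language="en"):
    """Mirror the n8n HTTP Request node: method, headers, and JSON body
    for Dumpling AI's get-youtube-transcript endpoint."""
    return {
        "url": "https://app.dumplingai.com/api/v1/get-youtube-transcript",
        "method": "POST",
        "headers": {
            "Content-Type": "application/json",
            # Assumed auth header; match whatever your httpHeaderAuth
            # credential actually sets for Dumpling AI.
            "Authorization": f"Bearer {api_key}",
        },
        "body": json.dumps({
            "videoUrl": video_url,
            "preferredLanguage": language,
        }),
    }
```

You could feed the result into any HTTP client (e.g. `urllib.request` or `requests`) to confirm the endpoint returns a transcript before wiring it into the workflow.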
Step 3: Set Up Generate Podcast Script
Use OpenAI to clean the transcript, summarize it, and generate a title in JSON format.
- Open Generate Podcast Script and select the model `chatgpt-4o-latest`.
- Keep JSON Output enabled to ensure structured output.
- In the user message content, ensure the transcript input uses `{{ $json.transcript }}`.
- Credential Required: Connect your openAiApi credentials.
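To see what this node is asking the model to do, here’s a sketch of the prompt structure plus a validation step for the JSON reply. The prompt wording is illustrative, not the template’s literal prompt, and the validation mirrors why JSON Output should stay enabled:

```python
import json

# Illustrative system prompt -- the template's actual wording may differ.
SYSTEM_PROMPT = (
    "You are a podcast script editor. Return ONLY valid JSON with keys "
    "'title', 'summary', and 'cleaned_transcript'. Remove filler words, "
    "fix punctuation, and label speakers clearly."
)

def build_script_messages(transcript):
    """Messages for a JSON-mode chat completion, mirroring the node's
    system + user prompt structure."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": f"Transcript:\n{transcript}"},
    ]

def parse_script_output(raw_content):
    """Validate the model's JSON reply before handing it to Airtable,
    so a malformed response fails loudly instead of inserting blanks."""
    data = json.loads(raw_content)
    missing = {"title", "summary", "cleaned_transcript"} - data.keys()
    if missing:
        raise ValueError(f"model omitted keys: {missing}")
    return data
```

The three validated keys are exactly what the Airtable node maps in the next step.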
Step 4: Configure Airtable Output
Store the generated title, summary, and cleaned transcript in Airtable.
- Open Store Script in Airtable and set Operation to `create`.
- Select your Airtable Base (cached name shows `Testing n8n`) and Table (cached name shows `podcast`).
- Map fields in Columns as follows: Title → `{{ $json.message.content.title }}`, summary → `{{ $json.message.content.summary }}`, podcast transcript → `{{ $json.message.content.cleaned_transcript }}`.
- Credential Required: Connect your airtableTokenApi credentials.
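Under the hood, the Airtable node sends a create request shaped like the Airtable REST API’s records payload. A minimal sketch of the field mapping, handy for double-checking that your column names line up (the endpoint path takes your own base ID and table name):

```python
def build_airtable_record(script):
    """Map the GPT output onto the Airtable fields the node expects.
    Column names are case-sensitive and must match your table exactly:
    'Title', 'summary', and 'podcast transcript'."""
    return {
        "records": [{
            "fields": {
                "Title": script["title"],
                "summary": script["summary"],
                "podcast transcript": script["cleaned_transcript"],
            }
        }]
    }

# The payload above would be POSTed to the Airtable REST API at
# https://api.airtable.com/v0/{base_id}/{table_name} with a
# personal-access-token Bearer header.
```

If a key is missing from the GPT output, this raises a `KeyError` up front rather than creating a half-empty record.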
- Field names in Airtable must match exactly (Title, summary, and podcast transcript) or the insert will fail.

Step 5: Test and Activate Your Workflow
Validate the full execution flow and enable the workflow for live monitoring.
- Click Execute Workflow and trigger RSS New Video Monitor with a recent RSS item.
- Verify the execution flow: RSS New Video Monitor → Fetch YouTube Transcript → Generate Podcast Script → Store Script in Airtable.
- Confirm Airtable receives a new record with the title, summary, and cleaned transcript fields populated.
- Toggle the workflow to Active for continuous operation.
Common Gotchas
- Airtable credentials can expire or need specific permissions. If things break, check the Airtable personal access token scopes and the base/table access first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
Frequently Asked Questions
**How long does setup take?**
About 30 minutes if your accounts are ready.
**Do I need coding skills?**
No. You’ll connect accounts and paste in the RSS feed URL.
**Is there a free option?**
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API costs (often just a few cents per episode) and Dumpling AI usage.
**Should I use n8n Cloud or self-host?**
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
**Can I customize the generated scripts?**
Yes, and it’s mostly prompt work. You’ll adjust the instructions inside the “Generate Podcast Script with GPT-4o” node to match your show style (cold open, ad break placeholders, intro/outro wording, or a tighter summary). If you want Airtable to store more fields, add them to the JSON output and map them in the “Store Script in Airtable” node. Common tweaks include adding timestamps, producing a short teaser paragraph, and generating episode titles in a specific naming pattern.
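For instance, adding a teaser and timestamps means extending both the prompt’s JSON contract and the Airtable mapping together. A small sketch, where the `Teaser` and `Timestamps` column names are hypothetical examples you’d swap for your own:

```python
# Hypothetical extra keys you'd instruct the model to include in its JSON.
EXTRA_KEYS = {
    "teaser": "One-paragraph episode teaser for show notes",
    "timestamps": "List of 'MM:SS topic' chapter markers",
}

def extend_field_mapping(base_fields, script):
    """Add the new outputs to the Airtable record; 'Teaser' and
    'Timestamps' are example column names -- match your own table."""
    fields = dict(base_fields)
    fields["Teaser"] = script.get("teaser", "")
    # Airtable long-text fields take a plain string, so join the list.
    fields["Timestamps"] = "\n".join(script.get("timestamps", []))
    return fields
```

The pattern generalizes: every new key goes in three places at once (the prompt, the JSON validation, and the Columns mapping), which keeps the node outputs and the table in sync.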
**Why did the Airtable node stop working?**
Most of the time it’s a token issue. Regenerate your Airtable personal access token, make sure it has access to the right base, then re-select the base and table in the Airtable node so n8n refreshes field mappings. If the workflow worked once and then stopped, check whether the token was rotated or the table name changed. Also confirm your Airtable rate limits if you’re bulk-processing older videos.
**How many videos can this handle?**
On a typical n8n Cloud plan, you can handle thousands of workflow runs per month, and each new upload is usually one run.
**Is n8n better than Zapier or Make for this?**
Often, yes, if you care about structured outputs and control. n8n handles more complex logic without you paying extra for every branch, and self-hosting is an option when volume grows. It’s also easier to keep the “AI prompt + parsing + Airtable mapping” in one place instead of patching together multiple steps. Zapier or Make can be simpler for quick two-step zaps, though. Talk to an automation expert if you want a recommendation based on your publishing cadence.
Once this is running, new YouTube uploads quietly turn into organized Airtable drafts your team can review in minutes. The workflow handles the repetitive stuff, so you can focus on recording and publishing.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.