January 22, 2026

RSS to Notion, summarized news with OpenAI

Lisa Granqvist, Partner, Workflow Automation Expert

Keeping up with tech news sounds easy until you’re drowning in tabs, half-read newsletters, and “I’ll save this for later” links you never open again. Then you still have to write a summary, paste it somewhere, and hope you don’t save the same story twice.

This hits marketing leads who need fresh angles for campaigns hardest, but founders tracking competitors and analysts doing market research feel it too. With RSS-to-Notion automation, you get clean, searchable briefs in one place, without the copy-paste dance.

This workflow pulls stories from The Verge and TechCrunch, summarizes them with OpenAI, and stores everything in Notion. You’ll see what it fixes, what you need, and how the flow works before you touch anything.

How This Automation Works

[Embedded template preview: RSS to Notion, summarized news with OpenAI]

The Challenge: Turning Daily News Into Usable Research

RSS is great at delivering volume. It’s terrible at delivering clarity. You skim headlines, open a few posts, then lose the thread because the details are spread across a dozen articles and none of them are summarized in your words. If you do save them, you end up with a messy Notion page full of raw links and copy-pasted chunks that aren’t searchable in any meaningful way. And duplicates creep in quietly, so later you’re re-reading the same announcement you already captured last week.

It adds up fast, and the friction compounds when you try to make this a daily habit.

  • Manual collecting from TechCrunch and The Verge turns into a daily 30-minute chore that never feels “done.”
  • Duplicate links sneak into your notes, so your “database” becomes unreliable and annoying to use.
  • Summaries are inconsistent because you write them when you’re rushed, which means they’re hard to skim later.
  • Full article text is missing, so search inside Notion only finds titles and whatever you pasted.

The Fix: Auto-Summarized RSS Stories Saved to Notion

This workflow takes two high-signal tech feeds (The Verge and TechCrunch) and turns them into a structured Notion knowledge base you’ll actually use. It runs on a schedule (daily at 11 AM, though it’s disabled by default) or manually when you want to test. Each RSS item gets a unique SHA256 hash based on its URL, so duplicates can be caught early without guessing. If the story is new, the workflow fetches the full article page, extracts the body text, cleans it up, and asks OpenAI to produce a concise plain-text summary capped at about 1,500 characters. Finally, it creates a new Notion database page with the title, summary, date, source, URL, the hash, and the full cleaned text for search.

The workflow starts by reading both RSS feeds, then processes stories in batches so it can handle a normal news day without choking. After a Notion duplicate check, it only spends LLM time on genuinely new items. The end result is a tidy Notion row per article, ready for searching, tagging, or turning into a digest.


Real-World Impact

Say you review 20 new stories a day across two feeds and you usually spend about 5 minutes per story to open it, skim it, paste the link into Notion, and write a quick summary. That’s roughly 100 minutes a day. With this workflow, you’ll spend maybe 10 minutes scanning the new Notion entries and starring the ones worth sharing, while OpenAI handles the summaries and Notion storage in the background. Over a week, that’s about 7 hours you get back.
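
The arithmetic behind that estimate, as a quick sanity check (assuming a 5-day work week):

```javascript
// Back-of-envelope for the time savings above (5-day week assumed).
const storiesPerDay = 20;
const minutesPerStoryManual = 5;   // open, skim, paste, summarize
const minutesPerDayAfter = 10;     // scan and star the new Notion entries

const savedPerDay = storiesPerDay * minutesPerStoryManual - minutesPerDayAfter;
const savedPerWeekHours = (savedPerDay * 5) / 60;
// savedPerDay = 90 minutes, savedPerWeekHours = 7.5 (roughly "7 hours")
```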

Requirements

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Notion database for storing briefs
  • OpenAI API key to generate plain-text summaries
  • Notion API access (create an internal integration in Notion)

Skill level: Beginner. You’ll connect Notion and OpenAI, then match a few database fields.

Need help implementing this? Talk to an automation expert (free 15-minute consultation).

The Workflow Flow

A scheduled run (or a manual test) starts everything. The workflow can run daily at a set time, and there’s also a manual trigger so you can test without waiting.

The Verge and TechCrunch feeds are pulled, then normalized. Each article URL gets converted into a SHA256 hash, which becomes the “fingerprint” used to detect duplicates quickly.

Only new articles move forward. For each item, the workflow checks your Notion database to see if that hash already exists. If it does, the item is skipped with no extra processing cost.

Full content is fetched, cleaned, summarized, and saved. New items trigger an HTTP fetch of the article page, HTML extraction (using the right CSS selectors per site), cleanup in JavaScript, then an OpenAI summary. A Notion page is created with the structured fields plus full cleaned text.

You can easily modify the RSS sources to include more sites based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Schedule Trigger

Set up both the scheduled and manual triggers that kick off the RSS ingestion.

  1. Open Scheduled Automation Trigger and set the schedule rule to run daily at hour 11. Note that the node ships disabled; enable it when you're ready for production.
  2. Confirm Manual Execution Start is present for on-demand testing.
  3. Verify parallel execution: Scheduled Automation Trigger outputs to both Read TechCrunch Feed and Read Verge Feed in parallel.
  4. Verify the same parallel behavior for manual runs: Manual Execution Start outputs to both Read Verge Feed and Read TechCrunch Feed in parallel.
Keep Manual Execution Start connected so you can test the full workflow without waiting for the schedule.

Step 2: Connect RSS Sources and Generate Article Hashes

Configure the RSS feeds and create a unique hash for deduplication.

  1. In Read TechCrunch Feed, set URL to https://techcrunch.com/feed/.
  2. In Read Verge Feed, set URL to https://www.theverge.com/rss/index.xml.
  3. In Generate Hash B, set Type to SHA256, Value to {{ $json.link }}, and Data Property Name to hash.
  4. In Generate Hash A, set Type to SHA256, Value to {{ $json.link }}, and Data Property Name to hash.
⚠️ Common Pitfall: If the source sites change their article page structure, the downstream HTML extraction may fail. Recheck the selectors in Extract HTML Body A and Extract HTML Body B if summaries look empty.

Step 3: Check Notion for Existing Entries

Use Notion queries to prevent duplicate entries based on the hash.

  1. Open Query Notion Pages A and select the Notion database for Database ID (currently [YOUR_ID]). Credential Required: Connect your notionApi credentials.
  2. In Query Notion Pages A, confirm the filter condition uses Hash|rich_text equals = {{ $json.hash }}.
  3. Open Query Notion Pages B and select the Notion database for Database ID (currently [YOUR_ID]). Credential Required: Connect your notionApi credentials.
  4. In Query Notion Pages B, confirm the filter condition uses Hash|rich_text equals = {{ $json.hash }}.
  5. Confirm that Check Existing Entry A and Check Existing Entry B use the expressions {{ $item("0").$node["Query Notion Pages A"].json["id"] }} and {{ $item("0").$node["Query Notion Pages B"].json["id"] }}, respectively, to detect duplicates.
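
Conceptually, the duplicate check boils down to "query by hash, branch on whether anything came back." A sketch of that logic (the Hash property and rich_text filter mirror the steps above; the helper functions themselves are hypothetical, not node code):

```javascript
// Filter shape sent to Notion's database query endpoint: find pages
// whose "Hash" rich_text property equals the article's SHA256 hash.
function hashFilter(hash) {
  return { property: 'Hash', rich_text: { equals: hash } };
}

// The IF nodes then branch on whether the query returned any page;
// a non-empty result means the story was already captured.
function isDuplicate(queryResults) {
  return Array.isArray(queryResults) && queryResults.length > 0;
}
```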

Step 4: Fetch, Extract, and Assemble Full Article Text

Retrieve each article page and clean its HTML into a full-text field used for summarization.

  1. In Fetch Article Page A, set URL to {{ $('Generate Hash A').item.json.link }}.
  2. In Extract HTML Body A, set Operation to extractHtmlContent and the CSS selector to .duet--article--article-body-component p with Return Array enabled.
  3. In Assemble Full Text A, keep the JavaScript Code that filters empty paragraphs and joins content into fullArticle.
  4. In Fetch Article Page B, set URL to {{ $('Generate Hash B').item.json.link }}.
  5. In Extract HTML Body B, set Operation to extractHtmlContent and the CSS selector to .entry-content p with Return Array enabled.
  6. In Assemble Full Text B, keep the JavaScript Code that builds the fullArticle field from extracted paragraphs.
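
The Assemble Full Text nodes do something like the following (a sketch of the described behavior, not the nodes' exact code): take the array of extracted paragraph strings, drop empties, and join the rest into one fullArticle field.

```javascript
// Turn the HTML Extract node's array of <p> contents into one clean
// fullArticle string, dropping whitespace-only paragraphs.
function assembleFullText(paragraphs) {
  return (paragraphs || [])
    .map((p) => p.trim())
    .filter((p) => p.length > 0)
    .join('\n\n');
}
```

In an n8n Code node, paragraphs would come from the extraction node's output item, and the result would be assigned to $json.fullArticle.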

Step 5: Set Up AI Summarization

Use GPT to summarize each article before storing it in Notion.

  1. Open Summarize Article A and confirm Text is set to {{ $json.fullArticle }} with the custom prompt already defined.
  2. Ensure LLM Chat Engine A is connected as the language model for Summarize Article A. Credential Required: Connect your openAiApi credentials in LLM Chat Engine A.
  3. Open Summarize Article B and confirm Text is set to {{ $json.fullArticle }} with the same summarization prompt.
  4. Ensure LLM Chat Engine B is connected as the language model for Summarize Article B. Credential Required: Connect your openAiApi credentials in LLM Chat Engine B.
Keep summary length under 1500 characters to stay within Notion’s limits, as defined in the prompts for Summarize Article A and Summarize Article B.
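
If you want a hard guarantee on top of the prompt, a small trim step before the Notion write is cheap insurance. This helper is hypothetical (it is not part of the template); Notion caps a single rich_text object at 2,000 characters, so 1,500 leaves headroom:

```javascript
// Hard-cap a summary at `max` characters, preferring to cut at the
// last sentence boundary inside the limit.
function capSummary(text, max = 1500) {
  if (text.length <= max) return text;
  const slice = text.slice(0, max);
  const lastStop = slice.lastIndexOf('.');
  return lastStop > 0 ? slice.slice(0, lastStop + 1) : slice;
}
```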

Step 6: Configure Notion Output

Create Notion database pages for each summarized article and loop through items.

  1. Open Create Notion Entry A and select the Notion database for Database ID. Credential Required: Connect your notionApi credentials.
  2. In Create Notion Entry A, set Title to {{ $('Generate Hash A').item.json.title }} and map properties like Summary, Date, Hash, URL, Digest Status, source, and Full Article using the existing expressions.
  3. Open Create Notion Entry B and select the Notion database for Database ID. Credential Required: Connect your notionApi credentials.
  4. In Create Notion Entry B, set Title to {{ $('Generate Hash B').item.json.title }} and map properties such as Summary, Date, Hash, URL, Digest Status, source, and Full Article using the existing expressions.
  5. Confirm that Create Notion Entry A connects back to Iterate Articles A and Create Notion Entry B connects back to Iterate Articles B, so batch processing continues until Fetch Completed.

Step 7: Test & Activate Your Workflow

Run a manual test, verify data in Notion, and then enable the scheduled automation.

  1. Click Execute Workflow using Manual Execution Start to run both feeds in parallel.
  2. Confirm that Query Notion Pages A and Query Notion Pages B either skip existing items or pass new items to article fetching.
  3. Verify that Summarize Article A and Summarize Article B return summaries and Create Notion Entry A / Create Notion Entry B create pages in your Notion database.
  4. When results look correct, enable Scheduled Automation Trigger for production runs.

Watch Out For

  • Notion credentials can expire or need specific permissions. If things break, check your Notion integration access and the database sharing settings first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.

Common Questions

How quickly can I implement this RSS-to-Notion automation?

About 30 minutes if your Notion database is ready.

Can non-technical teams implement this RSS-to-Notion automation?

Yes. You’ll mostly be connecting accounts and mapping a few Notion properties.

Is n8n free to use for this RSS-to-Notion workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API costs (often just a few cents per day for typical RSS volumes).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

How do I adapt this RSS-to-Notion automation to my specific challenges?

You can swap the RSS sources by adding another RSS Read node and reusing the same hash, Notion duplicate check, fetch, and summarize path. If the new site has a different HTML structure, update the CSS selectors in the Extract HTML Body node for that branch. Common tweaks include changing the summary style (more bullet points, a stricter neutral tone), adding tags in Notion based on keywords, or storing only the summary when you don't need full text.
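
For the keyword-tagging tweak, a minimal sketch (the tag rules and helper are hypothetical; you would map the result onto a Notion multi-select property):

```javascript
// Assign tags by scanning title + summary for keyword matches.
function tagArticle(title, summary) {
  const text = `${title} ${summary}`.toLowerCase();
  const rules = {
    ai: ['openai', 'llm', 'model'],
    funding: ['raises', 'series a', 'funding'],
  };
  return Object.entries(rules)
    .filter(([, words]) => words.some((w) => text.includes(w)))
    .map(([tag]) => tag);
}
```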

Why is my Notion connection failing in this workflow?

Usually it’s the database not being shared with your Notion integration, or an expired token. Re-check the integration permissions in Notion, then confirm the correct workspace and database are selected inside n8n.

What’s the capacity of this RSS-to-Notion automation?

On self-hosted n8n, there’s no fixed execution cap (it depends on your server). On n8n Cloud, capacity depends on your plan, but this workflow is typically fine for daily RSS runs because it only summarizes new items after the Notion duplicate check. In practice, processing a day’s worth of stories usually finishes in minutes, not hours. If you add many more feeds, expect to tune batching and watch OpenAI rate limits.

Is this RSS-to-Notion automation better than using Zapier or Make?

Often, yes. This workflow relies on branching logic, HTML extraction, JavaScript cleanup, and a “check then process” pattern to prevent duplicates, and that tends to get clunky (and pricey) in tools that charge per task. n8n also lets you self-host, which is a big deal once you start running daily research automations across multiple feeds. Zapier or Make can be totally fine for a simpler “RSS to Notion” link-saver, but you’ll miss the full-text extraction and consistent summarization unless you add more moving parts. If you want help deciding, Talk to an automation expert and we’ll map it to your setup.

Once this is running, your “news intake” stops being a bunch of tabs and becomes a searchable library. The workflow handles the repetitive stuff, so you can focus on decisions and ideas.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
