Baserow + Mailchimp: price trends sent hands-free
You check competitor prices, jot them in a sheet, forget a week, then scramble when a “sudden” price drop shows up in your ads or sales calls. It’s not hard work. It’s the constant checking that breaks you.
That pain hits eCommerce marketers first, because promos and positioning depend on timing. But store owners and ops leads feel it too, since inventory and margin decisions get made with half-complete info.
This workflow scrapes your product pages, stores clean price history in Baserow, then sends a weekly Mailchimp trend email and a Slack alert when something moves a lot. You’ll see what it does, what you need, and how to run it safely.
How This Automation Works
Here’s the complete workflow you’ll be setting up:
n8n Workflow Template: Baserow + Mailchimp: price trends sent hands-free
```mermaid
flowchart LR
subgraph sg0["Weekly Schedule Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "Weekly Schedule", pos: "b", h: 48 }
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Define Product Sources"]
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Iterate Products", pos: "b", h: 48 }
n3@{ icon: "mdi:cog", form: "rounded", label: "Scrape Product Page", pos: "b", h: 48 }
n4["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Clean & Enrich Data"]
n5@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Has Price Data?", pos: "b", h: 48 }
n6["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/baserow.svg' width='40' height='40' /></div><br/>Store Price Record"]
n7@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Is Price Below Threshold?", pos: "b", h: 48 }
n8@{ icon: "mdi:swap-vertical", form: "rounded", label: "Prepare Mailchimp Content", pos: "b", h: 48 }
n9["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Log Missing Price"]
n10["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Error Handler"]
n11["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/slack.svg' width='40' height='40' /></div><br/>Send a message"]
n5 --> n6
n5 --> n9
n0 --> n1
n2 --> n3
n6 --> n7
n4 --> n5
n3 --> n4
n3 --> n10
n1 --> n2
n7 --> n8
n8 --> n11
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n5,n7 decision
class n1,n4,n9,n10 code
classDef customIcon fill:none,stroke:none
class n1,n4,n6,n9,n10,n11 customIcon
```
Why This Matters: Weekly price checks don’t scale
Manual price monitoring always starts “small.” A handful of competitor URLs. A quick check on Monday. Then your catalog expands, seasons change, and you suddenly have 40 pages to open, copy, and sanity-check. The worst part is the mental overhead: you’re never sure if a price change is real, a temporary promotion, or just you misreading a variant selector. Miss two weeks and you lose the story, which means you make decisions on vibes instead of trends.
It adds up fast. Here’s where it usually breaks down.
- Copy-pasting prices from multiple sites turns into an hour-long ritual, and it still leaves gaps when a page fails to load or the price is hidden behind a selector.
- Teams end up with “price history” spread across tabs and tools, so nobody trusts the numbers enough to act quickly.
- A big competitor move can sit unnoticed for days, which means your paid spend and promo calendar drift out of sync.
- When you finally send an update email, it’s messy and inconsistent because the formatting depends on whoever had time that week.
What You’ll Build: Baserow price history + Mailchimp trend emails + Slack alerts
This workflow runs on a weekly schedule and pulls in a list of product URLs you care about. It scrapes each product page using the ScrapeGraphAI community node, then normalizes the result so the price is stored consistently (even when different sites format pricing differently). If a valid price is found, the workflow adds a record to a Baserow table called price_tracker, building a clean history over time. After that, it checks for “big move” situations and posts a Slack alert so you can react while it still matters. Finally, it composes a neat Mailchimp-ready summary so your team (or stakeholders) gets a weekly trend digest without you writing a single email by hand.
The workflow starts with a scheduled trigger, then loops through your URLs in small batches to avoid rate limits. ScrapeGraphAI extracts the product name and current price, Baserow stores it, and logic flags notable changes. You end the week with both a database you can trust and messaging that’s already sent.
What You’re Building
| What Gets Automated | What You’ll Achieve |
|---|---|
| Weekly scraping of competitor product pages | No more copy-paste price checks |
| Price normalization and Baserow storage | A clean price history your team can trust |
| Threshold checks with Slack alerts | Big moves surface while you can still react |
| Mailchimp-ready trend summaries | A consistent weekly digest, written for you |
Expected Results
Say you monitor 30 products across a few competitor sites. Manually, if it takes maybe 4 minutes to open a page, find the right variant, copy the price, and log it, that’s about 2 hours every week. With this workflow, you update the URL list once, then the weekly run happens automatically: a couple minutes to review the Slack alerts and skim the Mailchimp report, and you’re done. Most teams get roughly 90 minutes back per week right away, and the data quality is honestly the bigger win.
Before You Start
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Baserow for storing a clean price history table
- Mailchimp to send the weekly trend email to an audience
- ScrapeGraphAI API Key (get it from your ScrapeGraphAI account dashboard)
Skill level: Intermediate. You’ll paste API keys, map a few fields, and test-run the workflow once.
Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).
Step by Step
A weekly schedule kicks things off. The Scheduled Weekly Trigger runs on your chosen cadence (weekly by default, but you can switch it to daily during high season).
Your product URL catalog is prepared and batched. A Code node builds the list of URLs or SKUs, then Split in Batches processes them in small chunks so scraping doesn’t trip rate limits or timeouts.
Scraping and validation happens per product. ScrapeGraphAI extracts product name and price from each page, then a normalization step cleans the value (currency symbols, commas, odd formatting). An If check filters out empty or failed scrapes so you don’t pollute your history.
Results are stored and broadcast. Baserow stores the latest price snapshot, an alert rule checks for big moves, Slack gets the heads-up, and a Set node composes the Mailchimp payload for a tidy weekly report.
You can easily modify the URL list and alert thresholds to match your catalog and margin rules. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Scheduled Weekly Trigger
Set the workflow to run on a weekly cadence using the built-in schedule trigger.
- Add the Scheduled Weekly Trigger node as your trigger.
- In Scheduled Weekly Trigger, keep the rule interval set to weeks (already configured by the node).
- Connect Scheduled Weekly Trigger to Product URL Catalog.
Step 2: Connect the Product List and Batch Processing
Define the product URLs to scrape and process them in batches.
- Open Product URL Catalog and set jsCode to the provided product list.
- Update URLs or add products directly in the jsCode block (a starter sketch follows this list).
- Connect Product URL Catalog to Batch Item Iterator.
- Connect Batch Item Iterator to Page Data Scraper for per-item scraping.
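If you want a concrete starting point, here’s a minimal sketch of the kind of jsCode the Product URL Catalog node holds. The SKUs and URLs are placeholders, not part of the template:

```javascript
// Minimal sketch of a Product URL Catalog Code node.
// The SKUs and URLs below are placeholders; swap in the pages you track.
const products = [
  { sku: "SKU-001", url: "https://example.com/products/widget-pro" },
  { sku: "SKU-002", url: "https://example.com/products/widget-mini" },
];

// n8n Code nodes return an array of items wrapped in a `json` key,
// so Batch Item Iterator receives one item per product.
return products.map((p) => ({ json: p }));
```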
Step 3: Set Up the Scraping and Normalization Path
Scrape product pages and normalize price data. This step includes a parallel error-logging branch.
- In Page Data Scraper, set userPrompt to `Extract the following as JSON: {"name": "string", "currentPrice": "string", "currency": "string", "availability": "string"}. Make sure numbers include currency symbols if present.`
- Set websiteUrl to `{{ $json.url }}`.
- Connect Page Data Scraper to both Normalize Price Data and Scrape Error Logger, so successful scrapes move on while failures get logged in parallel.
- In Normalize Price Data, keep jsCode as provided to parse price, currency, availability, and scrapedAt (a sketch of that logic follows this list).
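The template’s jsCode handles this for you, but as a rough idea of what the normalization does, here’s a sketch. It assumes the scraper returns currentPrice and currency as strings and that prices use US-style formatting:

```javascript
// Rough sketch of price normalization, assuming a US-style string like
// "$1,299.00" in $json.currentPrice (adjust the regex for other locales).
const raw = String($json.currentPrice ?? "");
const cleaned = raw.replace(/[^\d.,]/g, "").replace(/,/g, "");
const price = parseFloat(cleaned);

return [{
  json: {
    name: $json.name,
    url: $json.url, // assumes the URL is carried through from the iterator
    price: Number.isFinite(price) ? price : 0, // 0 fails the "larger than 0" check
    currency: $json.currency || "USD",
    availability: $json.availability || "unknown",
    scrapedAt: new Date().toISOString(),
  },
}];
```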
Step 4: Validate and Store Price Entries
Only store valid price data and log missing values.
- In Validate Price Presence, confirm the condition is Number, with value1 set to `{{ $json.price }}`, operation set to `larger`, and value2 set to `0`.
- Connect the true output of Validate Price Presence to Write Price Entry.
- Connect the false output of Validate Price Presence to Log Missing Price Data.
- In Write Price Entry, set tableId to `{{ $env.BASEROW_TABLE_ID || 1 }}` and operation to `create` (the row mapping is sketched after this list).
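For orientation, the record Write Price Entry creates should line up with the price_tracker fields referenced later in this guide. Here’s a sketch of that mapping, assuming the normalized fields from the previous step:

```javascript
// Hypothetical field mapping for the price_tracker table. The keys must
// match your Baserow column names exactly, or the node throws field errors.
const row = {
  product_name: $json.name,
  product_url: $json.url,
  current_price: $json.price,
  scrape_date: $json.scrapedAt,
};
```

In practice you map these in the Baserow node’s UI; the object above just shows which normalized field feeds which column.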
Step 5: Configure Price Alerts and Slack Delivery
Trigger alerts when prices fall below your threshold and send the alert to Slack.
- In Check Price Threshold, confirm the condition is Number, with value1 set to `{{ $json.price }}` and value2 set to `{{ $env.PRICE_ALERT_THRESHOLD || 50 }}`.
- Connect Write Price Entry to Check Price Threshold.
- Connect Check Price Threshold to Compose Mailchimp Payload.
- Connect Compose Mailchimp Payload to Slack Alert Dispatch (a payload sketch follows this list).
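As a rough idea of what Compose Mailchimp Payload assembles, here’s a sketch. The output field names are illustrative; map them to whatever your Mailchimp and Slack nodes are configured to read:

```javascript
// Sketch of composing the digest line and alert text in one step.
// Output keys are illustrative, not the template's actual field names.
const line = `${$json.name}: ${$json.currency} ${$json.price} (as of ${$json.scrapedAt})`;

return [{
  json: {
    emailSubject: "Weekly price trends",
    emailHtml: `<p>${line}</p>`,
    slackText: `Price alert: ${line} is below your threshold`,
  },
}];
```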
Step 6: Test and Activate Your Workflow
Run a manual test to verify scraping, validation, storage, and alert delivery before enabling production.
- Click Execute Workflow to trigger a manual run from Scheduled Weekly Trigger.
- Verify that Page Data Scraper returns a JSON payload and that Normalize Price Data outputs structured fields like `price`, `currency`, and `scrapedAt`.
- Confirm that Write Price Entry creates rows in Baserow for valid prices.
- If the price is below the threshold, confirm Slack Alert Dispatch sends the alert.
- Once successful, toggle the workflow to Active to enable weekly automation.
Troubleshooting Tips
- Baserow credentials can expire or need specific permissions. If things break, check your Baserow API token and table access (especially the price_tracker field names) first; a quick token check is sketched after this list.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- ScrapeGraphAI outputs can change when a site redesigns a product page. If prices suddenly go missing, review the ScrapeGraphAI extraction settings and confirm the price selector still matches the page.
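If you suspect the token, a quick read against the Baserow API is the fastest check. This sketch assumes Baserow cloud and a table ID of 1; swap in your own values:

```javascript
// Quick Baserow token sanity check (Node 18+, which has built-in fetch).
// Assumes Baserow cloud; on self-hosted instances, use your own base URL.
const res = await fetch(
  "https://api.baserow.io/api/database/rows/table/1/?user_field_names=true&size=1",
  { headers: { Authorization: "Token YOUR_BASEROW_TOKEN" } }
);
console.log(res.status); // 200 means the token can read the table
```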
Quick Answers
How long does setup take?
About 10–15 minutes if your accounts and table are ready.
Do I need to know how to code?
No. You’ll mainly paste credentials and edit the URL list in the “Product URL Catalog” node.
Is this free to run?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in ScrapeGraphAI API costs (based on your plan and scrape volume).
Should I use n8n Cloud or self-host?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I customize the workflow?
Yes, and you should. Replace the URLs in “Product URL Catalog,” adjust batching in “Batch Item Iterator,” and change your alert rule in “Check Price Threshold.” Common tweaks include daily runs during sales season, adding a “target price” field per SKU, and sending the Slack alert to different channels based on product category.
Why is my Baserow node failing?
Usually it’s an API token issue or missing permissions on the workspace. Double-check that the token can write to the price_tracker table and that your field names match exactly (product_name, product_url, current_price, scrape_date). If you recently changed a field type in Baserow, n8n can start throwing “field mismatch” errors until you remap the fields in the Baserow node.
How many products can I track?
If you self-host n8n, there’s no execution limit (it mostly depends on your server and how fast sites respond). On n8n Cloud, your monthly execution cap depends on the plan, and scraping tends to use more time per item than basic API calls. Practically, many teams start with 20–100 product URLs weekly, then expand once they’re confident in scrape reliability.
Why n8n instead of Zapier or Make?
For scraping-based workflows, n8n is often a better fit because you can batch, branch, and add “if this fails, log it” logic without turning it into an expensive multi-step bill. You also get the option to self-host, which matters when you scale from 10 products to 200. Zapier and Make can be great for simple API-to-API sync, but scraping is messier, and you’ll want the extra control. If your team already lives in Mailchimp and Slack, this setup keeps the outputs familiar while the mechanics stay in the background. Talk to an automation expert if you want help choosing the cleanest approach.
Once this is running, you stop “checking prices” and start making decisions with a reliable weekly signal. Set it up, let it collect history, and use the time you get back for work that actually compounds.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.