Bright Data to Google Sheets, pricing stays current
Checking competitor prices manually sounds simple until it’s Tuesday, you’re juggling five tabs per SKU, and the “one quick check” turns into a messy spreadsheet you don’t fully trust. Miss a price drop for a day and you find out the hard way, usually after your sales slow down.
This price-tracking pain hits e-commerce managers hardest, but product marketers and revenue ops leads feel it too. This automation gives you daily, structured price intel in Google Sheets, plus Slack and email alerts when a competitor undercuts you beyond your threshold.
Below, you’ll see exactly how the workflow runs, what it replaces, and what you need to set it up in n8n without turning this into a technical project.
How This Automation Works
See how this solves the problem:
n8n Workflow Template: Bright Data to Google Sheets, pricing stays current
flowchart LR
subgraph sg0["Schedule Flow"]
direction LR
n0@{ icon: "mdi:swap-vertical", form: "rounded", label: "Load Competitor URLs", pos: "b", h: 48 }
n1@{ icon: "mdi:swap-vertical", form: "rounded", label: "Loop Through Competitors", pos: "b", h: 48 }
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Scrape with Bright Data"]
n3@{ icon: "mdi:cog", form: "rounded", label: "Wait for Scraping", pos: "b", h: 48 }
n4["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Fetch Scraped Data"]
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Parse Price Data"]
n6@{ icon: "mdi:database", form: "rounded", label: "Log to Google Sheets", pos: "b", h: 48 }
n7@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Check If Alert Needed", pos: "b", h: 48 }
n8["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/slack.svg' width='40' height='40' /></div><br/>Send Slack Alert"]
n9@{ icon: "mdi:message-outline", form: "rounded", label: "Send Email Alert", pos: "b", h: 48 }
n10@{ icon: "mdi:cog", form: "rounded", label: "Aggregate All Results", pos: "b", h: 48 }
n11["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Create Daily Summary"]
n12["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/slack.svg' width='40' height='40' /></div><br/>Send Daily Report to Slack"]
n13@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger", pos: "b", h: 48 }
n5 --> n6
n5 --> n7
n5 --> n1
n13 --> n0
n9 --> n10
n8 --> n10
n3 --> n4
n4 --> n5
n11 --> n12
n0 --> n1
n10 --> n11
n7 --> n8
n7 --> n9
n2 --> n3
n1 --> n2
n1 --> n10
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n13 trigger
class n7 decision
class n6 database
class n2,n4 api
class n5,n11 code
classDef customIcon fill:none,stroke:none
class n2,n4,n5,n8,n11,n12 customIcon
The Challenge: Keeping competitor pricing current without busywork
Competitive pricing research breaks down in boring, expensive ways. You open a list of competitor product pages, copy prices into a sheet, and tell yourself you’ll “do it daily,” right up until meetings pile up and it becomes weekly. Then you’re reacting instead of deciding. And because competitor sites format prices differently (discount banners, bundles, “from $X,” shipping baked in), the data you capture ends up inconsistent, which makes trend tracking feel pointless. Honestly, the worst part is the mental load. You’re never sure if you’re looking at today’s reality or last week’s.
It adds up fast. Here’s where it usually breaks down in day-to-day operations.
- Manual checks turn into a recurring calendar chore that quietly steals about 2 hours a week.
- One missed undercut can linger for days because nobody gets notified at the moment it matters.
- Copy-paste tracking creates messy history, so you can’t confidently answer “is this a trend or a blip?”
- Teams argue about numbers because the source and timestamp aren’t captured cleanly.
The Fix: Automated competitor price monitoring with Bright Data and Google Sheets
This workflow runs on a schedule (by default, every day at 9 AM) and does the price checks for you. It starts with your list of competitor product URLs and your own “current price” reference. For each competitor page, n8n sends a scraping job to Bright Data’s Web Scraper API, waits for completion, then retrieves the result and extracts the price even when the website layout differs. Next, it computes the metrics you actually care about, like the difference between your price and theirs, and it logs every check into Google Sheets so you build a clean history automatically. If a competitor is meaningfully cheaper than you (based on your threshold), it pushes an alert to Slack and sends an email so the right people see it immediately. After the run finishes, it generates a daily summary report and posts that to Slack as well.
The workflow begins with a timed trigger, then loops through each competitor URL in batches. Bright Data handles the scraping lifecycle, and n8n handles the comparison logic, logging, and alerts. Finally, Slack gets both the “something changed” warnings and the daily wrap-up report.
What Changes: Before vs. After
| What This Eliminates | Impact You’ll See |
|---|---|
| Manual price checks across five tabs per SKU | Roughly 2 hours a week back, with checks running on a fixed schedule |
| Missed undercuts that linger for days | Slack and email alerts the moment a competitor crosses your threshold |
| Copy-paste tracking with inconsistent price formats | A clean, timestamped price history logged automatically in Google Sheets |
| Debates over whose numbers are right | One shared source with URL and timestamp captured for every check |
Real-World Impact
Say you track 12 competitor URLs across a handful of key SKUs. Manually, you might spend roughly 5 minutes per URL between page loads, discount math, and updating your sheet, which is about an hour each time you run it. Do that five days a week and you’re close to 5 hours of repetitive work. With this automation, the “human time” is closer to 10 minutes: you review the Slack summary and only act on the undercut alerts while Google Sheets logs the rest in the background.
Requirements
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Bright Data for web scraping API access.
- Google Sheets to store price history and timestamps.
- Slack for fast undercut alerts and daily summaries.
- SMTP email provider to send backup alerts.
- Bright Data API token (get it from the Bright Data dashboard).
Skill level: Intermediate. You’ll connect accounts, paste API credentials, and edit a short list of competitor URLs and thresholds.
Need help implementing this? Talk to an automation expert (free 15-minute consultation).
The Workflow Flow
Daily schedule kicks it off. At 9 AM (or whatever time you choose), n8n starts a monitoring run automatically, so the process doesn’t depend on someone remembering.
Your competitor list gets loaded. The workflow pulls in an array of competitor product URLs along with your own reference price and an alert threshold (for example, alert if they’re more than 10% cheaper).
Bright Data scrapes each URL. n8n triggers a Bright Data scraping job, waits for the result, retrieves the dataset output, then standardizes the price so comparisons aren’t thrown off by formatting quirks.
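To picture what that standardization step has to cope with, here is a hypothetical normalizer (not the template's actual code — your dataset's field names and formats will vary) that turns messy price strings like "from $1,299.99" into plain numbers:

```javascript
// Hypothetical sketch of price normalization; real Bright Data output
// fields depend on the dataset you configure.
function parsePrice(raw) {
  if (raw === null || raw === undefined) return null;
  if (typeof raw === "number") return raw;
  // Strip thousands separators, then grab the first numeric token,
  // ignoring prefixes like "from" and currency symbols.
  const match = String(raw).replace(/,/g, "").match(/\d+(\.\d+)?/);
  return match ? parseFloat(match[0]) : null;
}
```

Returning `null` for unparseable values (rather than `0`) keeps a failed scrape from being logged as a real price of zero.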
Results get logged and evaluated. Google Sheets receives a new row for each competitor check, then an “if” decision checks the undercut rule. If it’s breached, Slack and email alerts go out; otherwise the run continues quietly.
A daily summary lands in Slack. At the end, the workflow aggregates outcomes, generates a summary (average differences, lowest/highest competitor), and publishes a report so the team has one shared view.
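The aggregation step can be sketched with a small helper like this — an illustrative shape only, assuming result fields named `competitorName`, `competitorPrice`, and `percentageDiff`, which may differ from your node's actual output:

```javascript
// Sketch of the daily-summary math: average difference plus the
// cheapest and most expensive competitor in the run.
function summarize(results) {
  const priced = results.filter(r => typeof r.competitorPrice === "number");
  const avgDiff =
    priced.reduce((sum, r) => sum + r.percentageDiff, 0) / priced.length;
  const byPrice = [...priced].sort((a, b) => a.competitorPrice - b.competitorPrice);
  return {
    checked: results.length,
    avgPercentageDiff: Number(avgDiff.toFixed(2)),
    lowest: byPrice[0].competitorName,
    highest: byPrice[byPrice.length - 1].competitorName,
  };
}
```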
You can easily modify the competitor URL list to monitor more products, or change the alert threshold to fit your margins. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Schedule Trigger
Set the workflow schedule so the price monitoring runs automatically.
- Add or open Timed Start Trigger.
- Define the schedule under Rule to match your desired monitoring interval.
- Confirm the trigger is connected to Configure Rival Links as shown in the flow.
Step 2: Connect Google Sheets
Log results to a spreadsheet for historical tracking and analytics.
- Open Record to Sheets and set Operation to `appendOrUpdate`.
- Set Document to `[YOUR_ID]` and Sheet to `Price History`.
- Confirm the column mappings use expressions like `{{ $json.competitorUrl }}`, `{{ $json.scrapedAt }}`, and `{{ $json.priceDifference }}`.
- Credential Required: Connect your Google Sheets OAuth2 credentials.
Note: Leaving `[YOUR_ID]` unchanged will cause the node to fail. Replace it with your Google Sheet ID.
Step 3: Set Up Competitor Inputs and Iteration
Define the competitor list, your price, and alert threshold, then iterate through each competitor.
- In Configure Rival Links, set competitors to the JSON array using `[{ "name": "Competitor A", "url": "...", "productName": "..." }]`.
- Set ourPrice to `149.99` and alertThreshold to `10` (adjust as needed).
- Open Iterate Rival List to ensure it is connected from Configure Rival Links for batch processing.
- Confirm Iterate Rival List routes to Trigger Data Scraper and also outputs to Combine All Outcomes on completion.
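The inputs above can be sketched as a plain object. The values here are illustrative only — swap in your own URLs and prices — but the key names (`competitors`, `ourPrice`, `alertThreshold`) are the ones this guide references:

```javascript
// Example shape of the Configure Rival Links data; URLs and prices
// are placeholders.
const config = {
  competitors: [
    { name: "Competitor A", url: "https://example.com/product-a", productName: "Widget Pro" },
    { name: "Competitor B", url: "https://example.com/product-b", productName: "Widget Pro" },
  ],
  ourPrice: 149.99,    // your current reference price
  alertThreshold: 10,  // alert when a rival is more than 10% cheaper
};
```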
Step 4: Configure Scraping and Price Processing
Trigger the Bright Data scraper, wait for completion, then compute price metrics from the results.
- In Trigger Data Scraper, set URL to `https://api.brightdata.com/datasets/v3/trigger` and JSON Body to the expression `{ "dataset_id": "gd_l7q7dkf244hwjntr0", "endpoint": "https://api.brightdata.com/datasets/v3/snapshot/gd_l7q7dkf244hwjntr0?format=json", "url": "{{ $json.competitors[$itemIndex].url }}", "discover_new_sites": false }`.
- Enable Send Body and Send Headers and ensure Content-Type is `application/json`.
- Credential Required: Connect your HTTP Header Auth credentials in Trigger Data Scraper.
- In Pause for Scrape, set Amount to `10` to allow the scrape to complete.
- In Retrieve Scrape Results, set URL to `{{ $json.snapshot_id ? 'https://api.brightdata.com/datasets/v3/snapshot/' + $json.snapshot_id + '?format=json' : 'https://api.brightdata.com/datasets/v3/progress/' + $json.snapshot_id }}`.
- Credential Required: Connect your HTTP Header Auth credentials in Retrieve Scrape Results.
- Review Compute Price Metrics to ensure the JavaScript parses price fields and uses `$('Configure Rival Links').item.json.ourPrice` and `$('Iterate Rival List').item.json.competitors[$itemIndex]`.
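If you want to sanity-check that node's math outside n8n, here is a minimal sketch of the same comparisons. The function name and rounding are assumptions for illustration, not the template's literal code, and the `$('…')` node references only exist inside n8n:

```javascript
// Sketch of the Compute Price Metrics logic: difference, percentage,
// and the flags the alert branch checks later.
function computeMetrics(ourPrice, competitorPrice, alertThreshold) {
  const priceDifference = Number((ourPrice - competitorPrice).toFixed(2));
  const percentageDiff = Number(((priceDifference / ourPrice) * 100).toFixed(2));
  return {
    priceDifference,
    percentageDiff,
    // "underpriced" means the competitor is cheaper than you
    isUnderpriced: competitorPrice < ourPrice,
    needsAlert: competitorPrice < ourPrice && percentageDiff > alertThreshold,
  };
}
```

With `ourPrice` at 149.99 and a competitor at 129.99, the gap works out to about 13.3%, which crosses a 10% threshold and would fire both alerts.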
Step 5: Configure Alerts, Aggregation, and Daily Summary
Send real-time alerts for underpriced competitors and publish a daily summary report.
- In Evaluate Alert Criteria, keep the conditions set to `{{ $json.isUnderpriced }}` is true and `{{ $json.percentageDiff }}` greater than `{{ $('Configure Rival Links').item.json.alertThreshold }}`.
- Confirm parallel execution: Compute Price Metrics outputs to Record to Sheets, Evaluate Alert Criteria, and Iterate Rival List in parallel.
- Confirm parallel execution: Evaluate Alert Criteria outputs to both Post Slack Warning and Dispatch Email Notice in parallel.
- In Post Slack Warning, keep the alert message with expressions like `{{ $json.competitorName }}` and `{{ $json.priceDifference }}`.
- Credential Required: Connect your Slack credentials in Post Slack Warning.
- In Dispatch Email Notice, set Subject to `Price Alert: {{ $json.competitorName }} - {{ $json.productName }}` and update To Email and From Email from `[YOUR_EMAIL]`.
- Credential Required: Connect your SMTP/Email Send credentials in Dispatch Email Notice.
- Ensure Combine All Outcomes aggregates outcomes before Generate Daily Summary creates the report and Publish Daily Slack Report posts it to Slack.
- Credential Required: Connect your Slack credentials in Publish Daily Slack Report.
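For a sense of what the Slack warning roughly assembles, here is an illustrative message builder. The real node uses n8n `{{ }}` expressions with these same field names rather than a JavaScript function, so treat this purely as a preview of the output:

```javascript
// Illustrative alert text; field names match those used in this guide.
function buildAlertText(item) {
  return [
    `Price alert: ${item.competitorName} undercuts you on ${item.productName}`,
    `Their price: $${item.competitorPrice} | Yours: $${item.ourPrice}`,
    `Difference: $${item.priceDifference} (${item.percentageDiff}% below you)`,
  ].join("\n");
}
```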
Step 6: Test and Activate Your Workflow
Run a manual test to validate scraping, alerting, and logging before enabling the schedule.
- Click Execute Workflow to run Timed Start Trigger manually.
- Verify that Record to Sheets appends rows in the `Price History` sheet and includes values like `competitorPrice` and `priceDifference`.
- Confirm alerts appear in Slack from Post Slack Warning and an email is sent by Dispatch Email Notice when a competitor undercuts your price.
- Check that Publish Daily Slack Report posts the daily summary after Generate Daily Summary runs.
- Once validated, switch the workflow to Active so Timed Start Trigger runs on schedule.
Watch Out For
- Bright Data credentials can expire or your dataset can be misconfigured. If scrapes suddenly return empty data, check your Bright Data dashboard for token status and dataset ID first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Slack alerts can fail quietly if the app loses channel permissions. When messages stop showing up, confirm the Slack connection in n8n and re-authorize the workspace if needed.
Common Questions
How long does setup take?
About 30–40 minutes if your Bright Data, Slack, and Google access is ready.
Do I need to be a developer?
No. You won’t write code, but you will paste API credentials and maintain a clean list of competitor URLs.
Is there a free way to run this?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Bright Data API usage and any email provider costs.
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
What should I customize first?
Start with the “Configure Rival Links” step where the competitor URLs, your reference price, and the threshold live. If you want a different data destination, you can swap the “Record to Sheets” step to Microsoft Excel 365 or Airtable without changing the scraping logic. Many teams also adjust the Slack message format in the “Post Slack Warning” step so it includes SKU, margin notes, or a direct link to the competitor page.
Why would a scrape fail or come back empty?
Usually it’s an invalid or expired API token, or the dataset ID changed in Bright Data. Update the credential in n8n, then run a single competitor URL to confirm the scraper returns data. If you’re scraping a lot of pages at once, rate limiting can also show up as timeouts or empty responses.
How many URLs can it handle?
It scales to dozens or hundreds of URLs, as long as you tune batching and wait time for Bright Data jobs.
Is n8n the right tool for this versus Zapier or Make?
Often, yes, because this isn’t a simple “if price then notify” task. You’re orchestrating scraping jobs, waiting for results, parsing prices, writing rows, branching on a threshold, and then generating an aggregated daily report. n8n handles that kind of multi-step logic cleanly, and self-hosting means you’re not paying per tiny step the way many teams end up doing in Zapier. Zapier or Make can still be fine if you already have a reliable price feed and only need basic routing into Slack. If you’re unsure, talk to an automation expert and map it to your volume and budget.
Once this is running, competitor monitoring turns into a quick review instead of a daily chore. The workflow handles the repetitive checks so your team can focus on pricing decisions that actually move revenue.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.