Bright Data to Google Sheets: Track Amazon Price Drops
Tracking Amazon price drops sounds simple until you’re juggling messy scraped pages, broken scrapers, and a spreadsheet that never stays up to date.
Affiliate marketers feel it when “today’s deal” is already gone. Ecommerce analysts feel it when a competitor changes price and nobody notices for a week. And a brand manager usually ends up stuck translating raw data into something leadership can act on. This Amazon price tracking automation fixes that.
You’ll set up an n8n workflow that scrapes a price-drop source with Bright Data, turns the HTML into clean product rows, adds plain-language summaries plus sentiment, and logs everything in Google Sheets so it’s easy to share.
How This Automation Works
Here’s the complete workflow you’ll be setting up:
n8n Workflow Template: Bright Data to Google Sheets, Track Amazon Price Drops
flowchart LR
subgraph sg0["When clicking ‘Test workflow’ Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking ‘Test workflow’", pos: "b", h: 48 }
n1@{ icon: "mdi:cog", form: "rounded", label: "Bright Data MCP Client List ..", pos: "b", h: 48 }
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Set input fields", pos: "b", h: 48 }
n3@{ icon: "mdi:swap-vertical", form: "rounded", label: "Split Out", pos: "b", h: 48 }
n4@{ icon: "mdi:swap-vertical", form: "rounded", label: "Loop Over Items", pos: "b", h: 48 }
n5@{ icon: "mdi:cog", form: "rounded", label: "Wait", pos: "b", h: 48 }
n6@{ icon: "mdi:robot", form: "rounded", label: "Summarize Content", pos: "b", h: 48 }
n7@{ icon: "mdi:robot", form: "rounded", label: "Sentiment Analysis", pos: "b", h: 48 }
n8@{ icon: "mdi:brain", form: "rounded", label: "Google Gemini Chat Model for..", pos: "b", h: 48 }
n9@{ icon: "mdi:brain", form: "rounded", label: "Google Gemini Chat Model for..", pos: "b", h: 48 }
n10["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge"]
n11@{ icon: "mdi:database", form: "rounded", label: "Update Google Sheets", pos: "b", h: 48 }
n12["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Webhook Notification for Pri.."]
n13@{ icon: "mdi:brain", form: "rounded", label: "Google Gemini Chat Model", pos: "b", h: 48 }
n14@{ icon: "mdi:robot", form: "rounded", label: "Structured Output Parser", pos: "b", h: 48 }
n15@{ icon: "mdi:robot", form: "rounded", label: "Structure Data Extract Using..", pos: "b", h: 48 }
n16@{ icon: "mdi:cog", form: "rounded", label: "MCP Client for Price Drop Da..", pos: "b", h: 48 }
n17@{ icon: "mdi:cog", form: "rounded", label: "MCP Client for Price Drop Da..", pos: "b", h: 48 }
n18@{ icon: "mdi:cog", form: "rounded", label: "Aggregate", pos: "b", h: 48 }
n19@{ icon: "mdi:robot", form: "rounded", label: "Recursive Character Text Spl..", pos: "b", h: 48 }
n5 --> n17
n10 --> n18
n18 --> n11
n18 --> n12
n3 --> n4
n4 --> n5
n2 --> n16
n6 --> n10
n7 --> n10
n11 --> n4
n13 -.-> n15
n14 -.-> n15
n15 --> n3
n1 --> n2
n19 -.-> n6
n0 --> n1
n16 --> n15
n8 -.-> n6
n9 -.-> n7
n17 --> n6
n17 --> n7
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n6,n7,n14,n15,n19 ai
class n8,n9,n13 aiModel
class n11 database
class n12 api
Why This Matters: Price Drops Change Faster Than Your Reports
Price drop pages are noisy. They’re full of repeating blocks, odd HTML, and “almost the same” product names that make matching a pain. Do it manually and you’ll lose an afternoon to copy-paste, only to realize the price changed again while you were formatting columns. Try a basic scraper and it works… until the site layout changes, a CAPTCHA appears, or you get rate-limited. Then your tracking stops quietly, which is honestly the worst kind of failure because you don’t even know you’re blind.
The friction compounds. Here’s where it usually breaks down.
- Refreshing a deal page, opening product tabs, and copying fields can take about 5 minutes per item, and that’s on a “good” day.
- HTML scraping outputs tend to be inconsistent, so you end up cleaning titles, prices, and discounts before you can analyze anything.
- Most teams never add context like “why this matters,” which means the sheet becomes a graveyard of numbers nobody reads.
- When you miss the moment, you miss the opportunity: pricing moves, ad bids shift, and your campaign or forecast is suddenly wrong.
What You’ll Build: A Price Drop Intelligence Sheet With Summaries
This workflow acts like a small monitoring engine. It starts by using Bright Data’s MCP client to scrape a price drop source that lists Amazon products and recent price changes. The workflow then hands the raw page content to an LLM step that extracts structured fields like product title, current price, discount, brand, and ratings. After that, it loops through each product, scrapes the detail page, and generates two useful “human” outputs: a concise summary of what changed, and a sentiment read based on review context (so you can see if the drop looks like a win, a warning sign, or just normal fluctuation). Finally, it aggregates the records and updates Google Sheets, plus it can send a webhook update for downstream alerts or dashboards.
The workflow kicks off, scrapes the listing page, then expands each product into individual items for processing. It waits briefly between requests (to stay stable), enriches each product with summary and sentiment, then writes clean rows to Google Sheets so your team can sort, filter, and share without extra cleanup.
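To make that concrete, here's a sketch of what a single product looks like by the time it reaches the sheet. The field names are illustrative; the real ones come from the extraction schema you configure in Step 3 plus the summary and sentiment steps:

```json
{
  "id": 1,
  "title": "Example Wireless Earbuds",
  "price": "$39.99",
  "savings": "33%",
  "link": "https://www.amazon.com/dp/EXAMPLE",
  "summary": "(illustrative) Price dropped sharply today; listing emphasizes battery life.",
  "sentiment": "positive"
}
```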
What You’re Building
| What Gets Automated | What You’ll Achieve |
|---|---|
| Scraping the price-drop listing and every product detail page via Bright Data MCP | No more manual copy-paste monitoring that goes stale by lunchtime |
| AI extraction of title, price, discount, brand, and ratings into clean rows | A dataset your team can sort, filter, and share without cleanup |
| Plain-language summaries and sentiment for each drop | Context leadership can act on instead of a graveyard of numbers |
| Google Sheets updates plus an optional webhook notification | Reports and downstream alerts that stay current on their own |
Expected Results
Say you track 40 price-drop items each morning. Manually, if you spend about 5 minutes per product to open pages, copy price/discount, and paste into a sheet, that's over 3 hours (40 × 5 = 200 minutes) gone before real work starts. With this workflow, you trigger one run and let it process in the background (often 20–30 minutes including waits and enrichment). You still review the sheet, but you're reviewing decisions, not doing data entry.
Before You Start
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Bright Data for managed scraping via MCP.
- Google Sheets to store and share the results.
- Google Gemini API key (get it from Google AI Studio).
Skill level: Intermediate. You should be comfortable connecting credentials and following a setup checklist for MCP on a self-hosted n8n.
Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).
Step by Step
A manual run starts the workflow. The provided template uses n8n's manual trigger, which is perfect for testing and demos; you can swap in a Schedule trigger later for "run this every morning" routines.
Bright Data MCP scrapes the price-drop listing page. Instead of maintaining proxies and fighting blocks yourself, the MCP client retrieves the page content in a way that’s built for real-world scraping conditions.
AI extracts structure, then enriches each product in a loop. The LLM chain parses the listing into product fields, splits them into individual items, and processes them in batches with a wait in between. For each item, the workflow scrapes the detail page and generates a short summary plus a sentiment interpretation.
Google Sheets is updated, and a webhook can notify other systems. The workflow aggregates the enriched records, updates spreadsheet rows, and sends a webhook update so you can connect alerts, dashboards, or downstream automation.
You can easily modify the scrape target and the summary/sentiment prompts to match your niche, your brands, or your reporting style. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
This workflow starts manually so you can run ad-hoc price drop analyses during setup and testing.
- Add the Manual Execution Start node as the trigger.
- Leave all fields at their defaults (no parameters are required).
- Connect Manual Execution Start to Retrieve MCP Tool List.
Step 2: Connect MCP Tools and Set Input Variables
These nodes initialize MCP tools and set the source URL and webhook destination used throughout the flow.
- Open Retrieve MCP Tool List and select your MCP credentials. Credential Required: Connect your mcpClientApi credentials.
- In Assign Input Variables, set price_drop_url to `https://camelcamelcamel.com/top_drops?t=daily`.
- In Assign Input Variables, set webhook_notification_url to `https://webhook.site/[YOUR_ID]` (both values are shown together after the pitfall note below).
- Connect Assign Input Variables to Scrape Price Drop Source.
⚠️ Common Pitfall: Replace [YOUR_ID] in the webhook URL, or your updates will go to a placeholder endpoint.
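For reference, once Assign Input Variables runs, the item flowing downstream carries just these two strings (the camelcamelcamel URL is this guide's default; swap it for your own source if you track a different page):

```json
{
  "price_drop_url": "https://camelcamelcamel.com/top_drops?t=daily",
  "webhook_notification_url": "https://webhook.site/[YOUR_ID]"
}
```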
Step 3: Scrape the Price Drop Source and Extract Structured Items
This stage scrapes the source page and converts it into structured items using an LLM with a structured parser.
- In Scrape Price Drop Source, set Tool Name to `scrape_as_markdown` and Tool Parameters to `={ "url": "{{ $json.price_drop_url }}" }`. Credential Required: Connect your mcpClientApi credentials.
- Open LLM Structured Extraction and set Text to `=Extract structured data from {{ $json.result.content[0].text }}`.
- Ensure LLM Structured Extraction has Has Output Parser enabled.
- Configure Structured Parse Output with the provided jsonSchemaExample (the sample array of `id`, `title`, `price`, `savings`, and `link` fields; a starter example follows this list).
- Connect Gemini Chat Model Core as the language model for LLM Structured Extraction. Credential Required: Connect your googlePalmApi credentials.
- Connect Structured Parse Output to LLM Structured Extraction as the output parser. (Credentials are added to Gemini Chat Model Core, not the parser.)
- Send results from LLM Structured Extraction into Expand Output Items with Field to Split Out set to `output`.
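If you want a starting point for that jsonSchemaExample, a minimal sample array covering the five fields named above could look like the following. The values are placeholders; n8n's Structured Output Parser infers the expected shape from the example you paste in:

```json
[
  {
    "id": 1,
    "title": "Example Product Name",
    "price": "$24.99",
    "savings": "38%",
    "link": "https://www.amazon.com/dp/EXAMPLE"
  }
]
```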
Step 4: Iterate Items, Delay Requests, and Run Parallel AI Analysis
This loop processes each item, waits to avoid rate limits, scrapes item details, and runs summary and sentiment analysis in parallel.
- Connect Expand Output Items to Iterate Through Items to batch over extracted items.
- Route the batch output to Delay Processing and set Amount to `10` seconds.
- In Scrape Item Detail Loop, set Tool Name to `scrape_as_markdown` and Tool Parameters to `={ "url": "{{ $json.link }}" }`. Credential Required: Connect your mcpClientApi credentials.
- Scrape Item Detail Loop outputs to both Content Summary and Sentiment Review in parallel.
- In Content Summary, set Chunking Mode to `advanced` and connect Recursive Text Chunker with Chunk Size set to `5000`.
- Attach Gemini Model for Summary as the language model for Content Summary. Credential Required: Connect your googlePalmApi credentials.
- In Sentiment Review, set Text to `=Perform sentiment analysis of {{ $json.result.content[0].text }}` and keep the provided inputSchema. Attach Gemini Model for Sentiment as the language model. Credential Required: Connect your googlePalmApi credentials.
- Send both AI outputs into Combine Analysis Results for merging (a sketch of the merged shape follows the pitfall note below).
⚠️ Common Pitfall: If you skip the delay, the scrape tools may hit rate limits when iterating through many items.
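After Combine Analysis Results, each item should carry both analyses side by side. The exact field names depend on your Summarize and Sentiment node settings, so treat this shape as an assumption to verify against a test run:

```json
{
  "summary": "Concise note on the product page and the price change.",
  "sentiment": "positive"
}
```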
Step 5: Aggregate and Deliver Results to Sheets and Webhook
The merged data is aggregated and then sent to both Google Sheets and a webhook endpoint in parallel.
- Configure Aggregate Records with Aggregate set to `aggregateAllItemData`.
- Aggregate Records outputs to both Update Spreadsheet Rows and Send Webhook Update in parallel.
- In Update Spreadsheet Rows, set Operation to `appendOrUpdate`, select your spreadsheet, and map the output column to `{{ $json.data.toJsonString() }}`. Credential Required: Connect your googleSheetsOAuth2Api credentials.
- In Send Webhook Update, set URL to `={{ $('Assign Input Variables').item.json.webhook_notification_url }}` and enable Send Body.
- Set the body parameter `response` to `={{ $json.data.toJsonString() }}` (an example payload follows this list).
- Ensure Update Spreadsheet Rows loops back to Iterate Through Items to continue processing the next batch.
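Because the body parameter is named `response` and `toJsonString()` serializes the aggregated records, the receiving endpoint gets the data as a JSON-encoded string, roughly like this (contents are illustrative):

```json
{
  "response": "[{\"id\":1,\"title\":\"Example Product\",\"price\":\"$24.99\",\"savings\":\"38%\",\"sentiment\":\"positive\"}]"
}
```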
Step 6: Test and Activate Your Workflow
Run a manual test to verify scraping, AI analysis, and delivery before activating the workflow.
- Click Execute Workflow from Manual Execution Start.
- Confirm Scrape Price Drop Source returns markdown content and LLM Structured Extraction outputs a structured array.
- Verify that Content Summary and Sentiment Review both complete and merge in Combine Analysis Results.
- Check your Google Sheet for new rows and your webhook endpoint for the `response` payload.
- When results look correct, toggle the workflow to Active for production use.
Troubleshooting Tips
- Bright Data credentials can expire or the proxy zone name can be wrong. If runs suddenly return empty pages, confirm your API token and that the "mcp_unlocker" Web Unlocker zone exists in the Bright Data control panel (a credential config sketch follows this list).
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Google Sheets updates can fail due to missing permissions on the spreadsheet or an incorrect worksheet/tab name. Open the n8n Google Sheets credential, re-auth, then double-check the Sheet ID and the target tab in the “Update Spreadsheet Rows” step.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
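If you need to rebuild the MCP Client (STDIO) credential mentioned above, it usually comes down to a command plus two environment variables. The package and variable names below follow Bright Data's MCP documentation at the time of writing; verify them against the current docs before relying on this sketch:

```json
{
  "command": "npx",
  "args": ["-y", "@brightdata/mcp"],
  "env": {
    "API_TOKEN": "your-bright-data-api-token",
    "WEB_UNLOCKER_ZONE": "mcp_unlocker"
  }
}
```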
Quick Answers
How long does this take to set up?
About 45 minutes if your Bright Data and Google accounts are ready.
Do I need to know how to code?
No. You'll connect credentials, set a few variables, and adjust prompts if you want a specific reporting style.
Can I run this for free?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You'll also need to factor in Bright Data usage plus Gemini API costs, which are usually a few cents per run depending on how many products you summarize.
Should I use n8n Cloud or self-host?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I track sites other than Amazon?
Yes, but you'll get the best results by changing two areas. Swap the scrape target in the "Scrape Price Drop Source" and "Scrape Item Detail Loop" steps to point at Walmart, eBay, or your own category pages. Then adjust the "Content Summary" and "Sentiment Review" prompts to output exactly what your team needs (for example, include stock risk, ad angle ideas, or competitor comparisons). You can also remove sentiment entirely if you only care about raw price movement.
What if the Bright Data MCP connection fails?
Usually it's an invalid API token or the MCP server isn't running on the machine hosting n8n. Double-check the Bright Data Web Unlocker zone name, then confirm the MCP Client (STDIO) credentials in n8n still point to the right local command and environment variables. If it fails only sometimes, you may be hitting rate limits on the target site, so increasing the wait between batches helps.
How many products can it handle per run?
Most teams run it for 20–100 products at a time, and the wait/batching controls keep it stable.
Is this a better fit for n8n than Zapier or Make?
Often, yes, because this isn't a simple "app to app" sync. You're scraping, transforming, looping through items, waiting between requests, and running multi-step AI enrichment, which is where Zapier and Make can get expensive or awkward fast. n8n also gives you a self-hosted option, which matters if you want higher volume without counting every task. If you only need "send me an alert when a single product changes price," Zapier might be simpler. But for building a real dataset in Sheets, n8n is a better fit. Talk to an automation expert if you want help picking the cleanest approach.
Once this is running, your sheet stays current and readable, even when the source site changes. You get the signal. Not the busywork.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.