January 22, 2026

Bright Data to Google Sheets: Track Amazon Price Drops

Lisa Granqvist, Workflow Automation Expert

Tracking Amazon price drops sounds simple until you’re juggling messy scraped pages, broken scrapers, and a spreadsheet that never stays up to date.

Affiliate marketers feel it when “today’s deal” is already gone. Ecommerce analysts feel it when a competitor changes price and nobody notices for a week. And a brand manager usually ends up stuck translating raw data into something leadership can act on. This Amazon price tracking automation fixes that.

You’ll set up an n8n workflow that scrapes a price-drop source with Bright Data, turns the HTML into clean product rows, adds plain-language summaries plus sentiment, and logs everything in Google Sheets so it’s easy to share.

How This Automation Works

Here’s the complete workflow you’ll be setting up:

n8n Workflow Template: Bright Data to Google Sheets, track Amazon price drops

Why This Matters: Price Drops Change Faster Than Your Reports

Price drop pages are noisy. They’re full of repeating blocks, odd HTML, and “almost the same” product names that make matching a pain. Do it manually and you’ll lose an afternoon to copy-paste, only to realize the price changed again while you were formatting columns. Try a basic scraper and it works… until the site layout changes, a CAPTCHA appears, or you get rate-limited. Then your tracking stops quietly, which is honestly the worst kind of failure because you don’t even know you’re blind.

The friction compounds. Here’s where it usually breaks down.

  • Refreshing a deal page, opening product tabs, and copying fields can take about 5 minutes per item, and that’s on a “good” day.
  • HTML scraping outputs tend to be inconsistent, so you end up cleaning titles, prices, and discounts before you can analyze anything.
  • Most teams never add context like “why this matters,” which means the sheet becomes a graveyard of numbers nobody reads.
  • When you miss the moment, you miss the opportunity: pricing moves, ad bids shift, and your campaign or forecast is suddenly wrong.

What You’ll Build: A Price Drop Intelligence Sheet With Summaries

This workflow acts like a small monitoring engine. It starts by using Bright Data’s MCP client to scrape a price drop source that lists Amazon products and recent price changes. The workflow then hands the raw page content to an LLM step that extracts structured fields like product title, current price, discount, brand, and ratings. After that, it loops through each product, scrapes the detail page, and generates two useful “human” outputs: a concise summary of what changed, and a sentiment read based on review context (so you can see if the drop looks like a win, a warning sign, or just normal fluctuation). Finally, it aggregates the records and updates Google Sheets, plus it can send a webhook update for downstream alerts or dashboards.

The workflow kicks off, scrapes the listing page, then expands each product into individual items for processing. It waits briefly between requests (to stay stable), enriches each product with summary and sentiment, then writes clean rows to Google Sheets so your team can sort, filter, and share without extra cleanup.
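To make the end result concrete, here is a sketch of what one finished row in the sheet might look like. The column names and values below are illustrative, not prescribed by the workflow; your mapping in the Google Sheets node determines the real ones:

```json
{
  "title": "Sample Portable Charger (hypothetical product)",
  "price": "$19.99",
  "savings": "33%",
  "brand": "ExampleBrand",
  "summary": "Price fell from $29.99 to $19.99; summary text is generated by the LLM step.",
  "sentiment": "positive"
}
```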


Expected Results

Say you track 40 price-drop items each morning. Manually, if you spend about 5 minutes per product to open pages, copy price/discount, and paste into a sheet, that’s about 200 minutes, over 3 hours, gone before real work starts. With this workflow, you trigger one run and let it process in the background (often 20–30 minutes including waits and enrichment). You still review the sheet, but you’re reviewing decisions, not doing data entry.

Before You Start

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Bright Data for managed scraping via MCP
  • Google Sheets to store and share the results
  • Google Gemini API key (get it from Google AI Studio)

Skill level: Intermediate. You’ll be comfortable connecting credentials and following a setup checklist for MCP on a self-hosted n8n.

Want someone to build this for you? Talk to an automation expert (free 15-minute consultation).

Step by Step

A manual run (or your preferred trigger) starts the workflow. The provided workflow uses a manual trigger in n8n, which is perfect for testing, demos, or “run this every morning” routines you later schedule.

Bright Data MCP scrapes the price-drop listing page. Instead of maintaining proxies and fighting blocks yourself, the MCP client retrieves the page content in a way that’s built for real-world scraping conditions.

AI extracts structure, then enriches each product in a loop. The LLM chain parses the listing into product fields, splits them into individual items, and processes them in batches with a wait in between. For each item, the workflow scrapes the detail page and generates a short summary plus a sentiment interpretation.

Google Sheets is updated, and a webhook can notify other systems. The workflow aggregates the enriched records, updates spreadsheet rows, and sends a webhook update so you can connect alerts, dashboards, or downstream automation.

You can easily modify the scrape target and the summary/sentiment prompts to match your niche, your brands, or your reporting style. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

This workflow starts manually so you can run ad-hoc price drop analyses during setup and testing.

  1. Add the Manual Execution Start node as the trigger.
  2. Leave all fields at their defaults (no parameters are required).
  3. Connect Manual Execution Start to Retrieve MCP Tool List.

Step 2: Connect MCP Tools and Set Input Variables

These nodes initialize MCP tools and set the source URL and webhook destination used throughout the flow.

  1. Open Retrieve MCP Tool List and select your MCP credentials. Credential Required: Connect your mcpClientApi credentials.
  2. In Assign Input Variables, set price_drop_url to https://camelcamelcamel.com/top_drops?t=daily.
  3. In Assign Input Variables, set webhook_notification_url to https://webhook.site/[YOUR_ID].
  4. Connect Assign Input Variables to Scrape Price Drop Source.

⚠️ Common Pitfall: Replace [YOUR_ID] in the webhook URL, or your updates will go to a placeholder endpoint.
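In n8n’s JSON view, the two assignments in Assign Input Variables look roughly like this. The exact node JSON varies by n8n version, so treat it as a sketch rather than something to paste verbatim (and keep your own value in place of [YOUR_ID]):

```json
{
  "assignments": [
    {
      "name": "price_drop_url",
      "type": "string",
      "value": "https://camelcamelcamel.com/top_drops?t=daily"
    },
    {
      "name": "webhook_notification_url",
      "type": "string",
      "value": "https://webhook.site/[YOUR_ID]"
    }
  ]
}
```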

Step 3: Scrape the Price Drop Source and Extract Structured Items

This stage scrapes the source page and converts it into structured items using an LLM with a structured parser.

  1. In Scrape Price Drop Source, set Tool Name to scrape_as_markdown and Tool Parameters to ={ "url": "{{ $json.price_drop_url }}" }. Credential Required: Connect your mcpClientApi credentials.
  2. Open LLM Structured Extraction and set Text to =Extract structured data from {{ $json.result.content[0].text }}.
  3. Ensure LLM Structured Extraction has Has Output Parser enabled.
  4. Configure Structured Parse Output with the provided jsonSchemaExample (the sample array of id, title, price, savings, and link fields).
  5. Connect Gemini Chat Model Core as the language model for LLM Structured Extraction. Credential Required: Connect your googlePalmApi credentials.
  6. Connect Structured Parse Output to LLM Structured Extraction as the output parser. (Credentials are added to Gemini Chat Model Core, not the parser.)
  7. Send results from LLM Structured Extraction into Expand Output Items with Field to Split Out set to output.
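If you need a starting point for the jsonSchemaExample in Structured Parse Output, a minimal sample array with the five fields named above could look like the following. All values are placeholders; the parser only uses the shape:

```json
[
  {
    "id": 1,
    "title": "Sample Product Title",
    "price": "$24.99",
    "savings": "40%",
    "link": "https://example.com/product"
  }
]
```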

Step 4: Iterate Items, Delay Requests, and Run Parallel AI Analysis

This loop processes each item, waits to avoid rate limits, scrapes item details, and runs summary and sentiment analysis in parallel.

  1. Connect Expand Output Items to Iterate Through Items to batch over extracted items.
  2. Route the batch output to Delay Processing and set Amount to 10 seconds.
  3. In Scrape Item Detail Loop, set Tool Name to scrape_as_markdown and Tool Parameters to ={ "url": "{{ $json.link }}" }. Credential Required: Connect your mcpClientApi credentials.
  4. Scrape Item Detail Loop outputs to both Content Summary and Sentiment Review in parallel.
  5. In Content Summary, set Chunking Mode to advanced and connect Recursive Text Chunker with Chunk Size set to 5000.
  6. Attach Gemini Model for Summary as the language model for Content Summary. Credential Required: Connect your googlePalmApi credentials.
  7. In Sentiment Review, set Text to =Perform sentiment analysis of {{ $json.result.content[0].text }} and keep the provided inputSchema. Attach Gemini Model for Sentiment as the language model. Credential Required: Connect your googlePalmApi credentials.
  8. Send both AI outputs into Combine Analysis Results for merging.

⚠️ Common Pitfall: If you skip the delay, the scrape tools may hit rate limits when iterating through many items.
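Steps 2–3 of this loop boil down to one MCP tool call per item. As a sketch, the Scrape Item Detail Loop settings amount to:

```json
{
  "toolName": "scrape_as_markdown",
  "toolParameters": { "url": "{{ $json.link }}" }
}
```

In the n8n UI, the parameters field is entered as the expression `={ "url": "{{ $json.link }}" }`, so each iteration scrapes the detail page for the current item’s link.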

Step 5: Aggregate and Deliver Results to Sheets and Webhook

The merged data is aggregated and then sent to both Google Sheets and a webhook endpoint in parallel.

  1. Configure Aggregate Records with Aggregate set to aggregateAllItemData.
  2. Aggregate Records outputs to both Update Spreadsheet Rows and Send Webhook Update in parallel.
  3. In Update Spreadsheet Rows, set Operation to appendOrUpdate, select your spreadsheet, and map the output column to {{ $json.data.toJsonString() }}. Credential Required: Connect your googleSheetsOAuth2Api credentials.
  4. In Send Webhook Update, set URL to ={{ $('Assign Input Variables').item.json.webhook_notification_url }} and enable Send Body.
  5. Set the body parameter response to ={{ $json.data.toJsonString() }}.
  6. Ensure Update Spreadsheet Rows loops back to Iterate Through Items to continue processing the next batch.
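Viewed as configuration, the webhook body is a single parameter named response carrying the aggregated data as a JSON string. A sketch of the Send Webhook Update body section (field names follow the steps above; the surrounding node JSON is simplified):

```json
{
  "bodyParameters": [
    {
      "name": "response",
      "value": "={{ $json.data.toJsonString() }}"
    }
  ]
}
```

Whatever listens at webhook_notification_url should expect that single response field and parse its string value as JSON.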

Step 6: Test and Activate Your Workflow

Run a manual test to verify scraping, AI analysis, and delivery before activating the workflow.

  1. Click Execute Workflow from Manual Execution Start.
  2. Confirm Scrape Price Drop Source returns markdown content and LLM Structured Extraction outputs a structured array.
  3. Verify that Content Summary and Sentiment Review both complete and merge in Combine Analysis Results.
  4. Check your Google Sheet for new rows and your webhook endpoint for the response payload.
  5. When results look correct, toggle the workflow to Active for production use.

Troubleshooting Tips

  • Bright Data credentials can expire or the proxy zone name can be wrong. If runs suddenly return empty pages, confirm your API token and that the “mcp_unlocker” Web Unlocker zone exists in the Bright Data control panel.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Google Sheets updates can fail due to missing permissions on the spreadsheet or an incorrect worksheet/tab name. Open the n8n Google Sheets credential, re-auth, then double-check the Sheet ID and the target tab in the “Update Spreadsheet Rows” step.
  • Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.

Quick Answers

What’s the setup time for this Amazon price tracking automation?

About 45 minutes if your Bright Data and Google accounts are ready.

Is coding required for this Amazon price tracking?

No. You’ll connect credentials, set a few variables, and adjust prompts if you want a specific reporting style.

Is n8n free to use for this Amazon price tracking workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Bright Data usage plus Gemini API costs, which are usually a few cents per run depending on how many products you summarize.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I modify this Amazon price tracking workflow for different use cases?

Yes, but you’ll get the best results by changing two areas. Swap the scrape target in the “Scrape Price Drop Source” and “Scrape Item Detail Loop” steps to point at Walmart, eBay, or your own category pages. Then adjust the “Content Summary” and “Sentiment Review” prompts to output exactly what your team needs (for example, include stock risk, ad angle ideas, or competitor comparisons). You can also remove sentiment entirely if you only care about raw price movement.

Why is my Bright Data connection failing in this workflow?

Usually it’s an invalid API token or the MCP server isn’t running on the machine hosting n8n. Double-check the Bright Data Web Unlocker zone name, then confirm the MCP Client (STDIO) credentials in n8n still point to the right local command and environment variables. If it fails only sometimes, you may be hitting rate limits on the target site, so increasing the wait between batches helps.

What volume can this Amazon price tracking workflow process?

Most teams run it for 20–100 products at a time, and the wait/batching controls keep it stable.

Is this Amazon price tracking automation better than using Zapier or Make?

Often, yes, because this isn’t a simple “app to app” sync. You’re scraping, transforming, looping through items, waiting between requests, and running multi-step AI enrichment, which is where Zapier and Make can get expensive or awkward fast. n8n also gives you a self-hosted option, which matters if you want higher volume without counting every task. If you only need “send me an alert when a single product changes price,” Zapier might be simpler. But for building a real dataset in Sheets, n8n is a better fit. Talk to an automation expert if you want help picking the cleanest approach.

Once this is running, your sheet stays current and readable, and because the LLM extracts fields from page content rather than fixed selectors, it tolerates minor layout changes on the source site. You get the signal. Not the busywork.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

