Decodo + Google Sheets: track Amazon price drops
You find a “great” Amazon price drop, copy details into a sheet, open the product page, double-check the discount, then do it again. And again. By the time you’re done, the deal window has moved and you still don’t trust the data.
This grind hits e-commerce analysts hardest, but affiliate marketers and small product teams feel it too. The goal is simple: build a clean, searchable deal history without babysitting tabs all day.
This workflow uses Decodo to scrape price-drop listings, OpenAI to structure and analyze them, then appends everything to Google Sheets. You’ll see what it captures, what it changes in your day-to-day, and what to watch out for.
How This Automation Works
See how this solves the problem:
n8n Workflow Template: Decodo + Google Sheets: track Amazon price drops
```mermaid
flowchart LR
subgraph sg0["Manual Execution Start Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "Manual Execution Start", pos: "b", h: 48 }
n1@{ icon: "mdi:swap-vertical", form: "rounded", label: "Assign Input Parameters", pos: "b", h: 48 }
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Expand Array Items", pos: "b", h: 48 }
n3@{ icon: "mdi:swap-vertical", form: "rounded", label: "Iterate Product Batches", pos: "b", h: 48 }
n4@{ icon: "mdi:robot", form: "rounded", label: "Analyze Sentiment Tone", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Combine Summary & Sentiment"]
n6@{ icon: "mdi:database", form: "rounded", label: "Modify Google Spreadsheet", pos: "b", h: 48 }
n7@{ icon: "mdi:robot", form: "rounded", label: "Structured Output Reader", pos: "b", h: 48 }
n8@{ icon: "mdi:robot", form: "rounded", label: "LLM Structured Extraction", pos: "b", h: 48 }
n9@{ icon: "mdi:cog", form: "rounded", label: "Aggregate Results", pos: "b", h: 48 }
n10@{ icon: "mdi:cog", form: "rounded", label: "Decodo Scrape Request", pos: "b", h: 48 }
n11@{ icon: "mdi:brain", form: "rounded", label: "OpenAI Chat Engine", pos: "b", h: 48 }
n12@{ icon: "mdi:cog", form: "rounded", label: "Product Page Scraper", pos: "b", h: 48 }
n13@{ icon: "mdi:brain", form: "rounded", label: "OpenAI Sentiment Model", pos: "b", h: 48 }
n14@{ icon: "mdi:robot", form: "rounded", label: "Generate Content Summary", pos: "b", h: 48 }
n15@{ icon: "mdi:brain", form: "rounded", label: "OpenAI Summary Model", pos: "b", h: 48 }
n5 --> n9
n10 --> n8
n9 --> n6
n2 --> n3
n3 --> n12
n1 --> n10
n11 -.-> n8
n14 --> n5
n4 --> n5
n6 --> n3
n12 --> n4
n12 --> n14
n7 -.-> n8
n8 --> n2
n0 --> n1
n15 -.-> n14
n13 -.-> n4
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n4,n7,n8,n14 ai
class n11,n13,n15 aiModel
class n6 database
```
The Challenge: Keeping a trustworthy Amazon deal log
Price drops look easy until you try to track them consistently. One day you capture product names but forget the links. The next day you save links but the price field is messy, or the “savings” value is missing. Then there’s the human part: you’re skimming descriptions, trying to guess if a deal is worth pushing, and leaving vague notes you can’t use later. After a week, your “tracking sheet” turns into a noisy dump that doesn’t help you spot trends or winners.
It adds up fast. Here’s where it breaks down in real work.
- You spend about 10 minutes per product bouncing between a listing page, the product page, and your spreadsheet.
- Unstructured info (titles, prices, savings text) gets pasted in different formats, so filtering and charting become a chore.
- It’s easy to miss context, so your “deal notes” are thin and you end up re-researching the same item later.
- When the workload spikes, you stop tracking for a few days, which ruins any attempt at trend analysis.
The Fix: Scrape, structure, and log price drops automatically
This workflow creates a simple pipeline: pull a list of Amazon price drops (via a CamelCamelCamel “daily drops” URL), enrich each product with deeper page data, and write the results into a Google Sheet you can actually use. It starts with an input URL you control, then Decodo scrapes the listing content and grabs the raw product details. Next, OpenAI turns that messy text into structured fields (think: consistent name, current price, savings, and product link). After that, the workflow loops through each product URL, scrapes the product page, and runs two AI analyses: a readable summary plus sentiment and key topics. Finally, everything is merged, aggregated, and appended to your “Pricedrop Info” spreadsheet so your deal history stays clean automatically.
The workflow starts when you run it manually (or schedule it later). Decodo captures price-drop items, then OpenAI formats the output into predictable JSON fields. Each product gets revisited for enrichment, and the final dataset lands in Google Sheets ready for filtering, notes, and trend tracking.
What Changes: Before vs. After
| What This Eliminates | Impact You’ll See |
|---|---|
| ~10 minutes of tab-hopping and copy-paste per product | Hands-on time drops to pasting one URL and reviewing finished rows |
| Titles, prices, and savings text pasted in inconsistent formats | Predictable, structured fields you can filter and chart |
| Thin deal notes that force you to re-research items | An AI summary plus sentiment and key topics on every row |
| Tracking gaps whenever the workload spikes | A continuous deal history that actually supports trend analysis |
Real-World Impact
Say you track 20 price-drop products each morning. Manually, even a “quick” process is maybe 10 minutes per item once you open the listing, click into the product, copy fields, and write a note, which is about 3 hours. With this workflow, you paste the drop URL once, let it run, and review the finished rows in Google Sheets; your “hands-on” time becomes closer to 15 minutes, plus waiting for processing in the background. That’s roughly 2+ hours back on days you’re actively tracking.
Requirements
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Decodo for scraping Amazon price-drop pages
- Google Sheets to store your deal history
- OpenAI API key (get it from the OpenAI dashboard)
Skill level: Intermediate. You’ll connect credentials, install a community node, and edit a few input fields.
Need help implementing this? Talk to an automation expert (free 15-minute consultation).
The Workflow Flow
Run trigger and input setup. You start the workflow manually, and it reads your price_drop_url (the default points to CamelCamelCamel daily drops, but you can swap it).
Scrape the drop listings. Decodo pulls the listing page and returns the products it finds, including titles, pricing info, savings text, and links. This is the raw material, which tends to be messy.
Structure and enrich the data. OpenAI formats the scraped content into clean JSON fields, then the workflow loops through each product in batches and scrapes the product page for more context. Two AI passes follow: a summary you can skim and sentiment insights (tone, score, and key topics) that help with quick triage.
Write to Google Sheets. The workflow merges the summary and sentiment outputs, aggregates results, and appends new rows to your target sheet (the “Pricedrop Info” spreadsheet in the provided setup).
You can easily modify the price_drop_url to track a different drop feed based on your needs. See the full implementation guide below for customization options.
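For instance, CamelCamelCamel's top-drops page takes a time-window query parameter. The `weekly` value below is an assumption extrapolated from the default daily URL's format, so verify any variant in your browser before wiring it in:

```json
{
  "daily": "https://camelcamelcamel.com/top_drops?t=daily",
  "weekly": "https://camelcamelcamel.com/top_drops?t=weekly"
}
```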
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
Start the workflow with a manual trigger and define the base URL used for scraping price drops.
- Add and open Manual Execution Start to confirm it is the trigger node.
- Open Assign Input Parameters and set price_drop_url to `https://camelcamelcamel.com/top_drops?t=daily`.
- Connect Manual Execution Start → Assign Input Parameters to match the execution flow (the item this produces is sketched below).
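Once Step 1 runs, each item carries a single field. A minimal sketch of the node's output, using the default URL from above:

```json
{
  "price_drop_url": "https://camelcamelcamel.com/top_drops?t=daily"
}
```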
Step 2: Connect Decodo Scraping
Scrape the price drop page and pass the extracted content into the LLM extraction stage.
- Open Decodo Scrape Request and set URL to `{{ $json.price_drop_url }}`.
- Keep Headless set to `false` and Markdown set to `true`.
- Credential Required: Connect your decodoApi credentials in Decodo Scrape Request.
- Connect Assign Input Parameters → Decodo Scrape Request → LLM Structured Extraction.
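The exact response shape depends on your Decodo node version and plan, but the expressions used later in this guide (`{{ $json.results[0].content }}`) imply a payload roughly like the sketch below; the markdown content itself is illustrative:

```json
{
  "results": [
    {
      "content": "# Top Price Drops (daily)\n\n1. Product title — $24.99 (30% off) — https://www.amazon.com/dp/..."
    }
  ]
}
```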
Step 3: Set Up LLM Structured Extraction
Use an LLM to parse the scraped markdown into structured JSON for downstream processing.
- Open LLM Structured Extraction and set Text to `Extract clean, structured JSON data from the following text: {{ $json.results[0].content }}`.
- Ensure Has Output Parser is enabled so Structured Output Reader can validate the schema.
- Open Structured Output Reader and confirm the JSON schema matches the required product fields (id, title, price, savings, link).
- Credential Required: Connect your openAiApi credentials in OpenAI Chat Engine.
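Since Expand Array Items later splits on the `output` field, the parsed result should be an array of product objects. Here is a minimal JSON Schema sketch covering the fields listed above; the types are assumptions, since price and savings usually arrive as display strings like "$24.99" and "30% off":

```json
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "id": { "type": "number" },
      "title": { "type": "string" },
      "price": { "type": "string" },
      "savings": { "type": "string" },
      "link": { "type": "string" }
    },
    "required": ["id", "title", "price", "savings", "link"]
  }
}
```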
Step 4: Process Products in Batches and Run Parallel AI Analysis
Split the structured array into individual items, scrape each product page, and run sentiment and summary analysis in parallel.
- Open Expand Array Items and set Field To Split Out to `output`.
- Connect LLM Structured Extraction → Expand Array Items → Iterate Product Batches.
- Open Product Page Scraper and set URL to `{{ $json.link }}`, with Headless set to `false` and Markdown set to `true`.
- Credential Required: Connect your decodoApi credentials in Product Page Scraper.
- Product Page Scraper outputs to both Analyze Sentiment Tone and Generate Content Summary in parallel.
- For Analyze Sentiment Tone, set Text to `Perform a detailed sentiment analysis on the following content: {{ $json.results[0].content }}`.
- Credential Required: Connect your openAiApi credentials in OpenAI Sentiment Model (the language model for Analyze Sentiment Tone).
- For Generate Content Summary, set Text to `Generate a clear, accurate, and comprehensive summary of the following content: {{ $json.results[0].content }}`.
- Credential Required: Connect your openAiApi credentials in OpenAI Summary Model (the language model for Generate Content Summary).
- Connect both AI nodes into Combine Summary & Sentiment to merge their outputs.
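The sentiment node's output shape depends entirely on the prompt, so treat this as an assumed example of the "tone, score, and key topics" result described earlier, not a guaranteed format; attach an output parser here too if you need strict fields:

```json
{
  "tone": "positive",
  "score": 0.82,
  "key_topics": ["battery life", "fast charging", "value vs. list price"]
}
```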
Step 5: Configure Aggregation and Google Sheets Output
Aggregate merged results and append or update them in your Google Sheet, then loop to the next batch.
- Connect Combine Summary & Sentiment → Aggregate Results.
- Open Aggregate Results and keep Aggregate set to `aggregateAllItemData`.
- Open Modify Google Spreadsheet and set Operation to `appendOrUpdate`.
- Set Document to `[YOUR_ID]` and Sheet to `Sheet1` (gid=0).
- Map the output column to `{{ $json.data.toJsonString() }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials in Modify Google Spreadsheet.
- Connect Modify Google Spreadsheet back to Iterate Product Batches to continue batch processing.
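Because the column is mapped to `{{ $json.data.toJsonString() }}`, each row stores the aggregated analysis as one JSON string. A hypothetical cell value, pretty-printed here for readability; the field names will match whatever your summary and sentiment nodes actually return:

```json
[
  {
    "title": "Example product title",
    "link": "https://www.amazon.com/dp/...",
    "summary": "Short, skimmable description of the product and the deal.",
    "tone": "positive",
    "score": 0.82
  }
]
```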
Step 6: Test and Activate Your Workflow
Run a manual test to ensure the scrape, extraction, AI analysis, and spreadsheet update all succeed.
- Click Execute Workflow from Manual Execution Start to run a full test.
- Verify Decodo Scrape Request returns markdown content and LLM Structured Extraction produces a structured array.
- Confirm both Analyze Sentiment Tone and Generate Content Summary return valid JSON and merge correctly in Combine Summary & Sentiment.
- Check that Modify Google Spreadsheet appends or updates rows in your target sheet.
- When everything looks correct, toggle the workflow to Active for production use.
Watch Out For
- Decodo credentials can expire or need specific permissions. If things break, check the Decodo dashboard token status and your plan limits first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Default prompts in AI nodes are generic. Add your brand voice early or you’ll be editing outputs forever.
Common Questions
How long does setup take?
About an hour if your accounts and API keys are ready.
Can a non-developer set this up?
Yes, but you’ll want someone comfortable with connecting credentials and testing a few runs. The only “fiddly” part is installing the Decodo community node on self-hosted n8n.
Is there a free way to run this?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API usage and your Decodo plan for scraping.
Should I use n8n Cloud or self-host?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
How can I customize this workflow?
Start by changing the input URL in the “Assign Input Parameters” node so you’re scraping the exact CamelCamelCamel feed you care about (daily, weekly, category pages). If you want more columns, edit the schema in the “Structured Output Reader” so OpenAI returns fields like brand, rating, or availability. You can also tune the “Generate Content Summary” and “Analyze Sentiment Tone” prompts to match how your team evaluates deals, like highlighting resale potential or identifying “promo fluff” language.
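Adding those extra columns is mostly a schema change. A sketch of the extended item definition, assuming you keep the original five fields and treat the new ones as plain strings:

```json
{
  "type": "object",
  "properties": {
    "id": { "type": "number" },
    "title": { "type": "string" },
    "price": { "type": "string" },
    "savings": { "type": "string" },
    "link": { "type": "string" },
    "brand": { "type": "string" },
    "rating": { "type": "string" },
    "availability": { "type": "string" }
  }
}
```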
What should I check if the Decodo scrape fails?
Most of the time it’s an invalid or expired API token, so regenerate it in Decodo and update the credential in n8n. It can also be plan limits or blocked targets, especially if you’re scraping too aggressively or hitting the same domain repeatedly. If it fails only on the product-page enrichment step, reduce the batch size and try again. Finally, confirm you’re running self-hosted n8n since the Decodo node is a community node.
How many products can this handle?
If you self-host, capacity mainly depends on your server and your Decodo/OpenAI limits. On n8n Cloud, plan execution limits matter more.
Is n8n better than Zapier or Make for this?
Often, yes. Zapier and Make struggle when you need multi-step scraping, looping over dozens of products, and running two AI analyses per item without costs ballooning. n8n is also more flexible for structured outputs (JSON parsing, aggregation, merging) and it can be self-hosted, which is important here because of the Decodo community node. If you’re only logging a handful of deals manually triggered from a form, those tools can be fine. For a real price-intelligence pipeline, n8n fits better. Talk to an automation expert if you want a quick recommendation for your setup.
Once this is running, your spreadsheet stops being a messy scratchpad and starts acting like a real deal database. The workflow handles the repetitive collection and cleanup so you can focus on what to do with the data.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.