CSV to Google Sheets, always updated without duplicates
You paste a CSV into a spreadsheet, tweak a few columns, and think you’re done. Then the source file changes, duplicates creep in, and your “report” quietly stops being trustworthy.
This CSV-to-Sheets sync hits marketing ops first (lists and weekly dashboards), but agency owners and analysts run into the same mess. The goal is simple: keep a Google Sheet updated from a CSV URL without duplicate rows or constant cleanup.
This workflow pulls a CSV from a URL, turns it into structured rows, creates a unique key, filters what you care about, then appends or updates the right rows in Google Sheets. You’ll see what it automates, what results you can expect, and what to watch out for.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: CSV to Google Sheets, always updated without duplicates
```mermaid
flowchart LR
  subgraph sg0["When clicking 'Execute Workflow' Flow"]
    direction LR
    n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking 'Execute Workflow'", pos: "b", h: 48 }
    n1@{ icon: "mdi:database", form: "rounded", label: "Upload to spreadsheet", pos: "b", h: 48 }
    n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Add unique field", pos: "b", h: 48 }
    n3@{ icon: "mdi:cog", form: "rounded", label: "Import CSV", pos: "b", h: 48 }
    n4@{ icon: "mdi:download", form: "rounded", label: "Download CSV", pos: "b", h: 48 }
    n5@{ icon: "mdi:swap-horizontal", form: "rounded", label: "Keep only DACH in 2023", pos: "b", h: 48 }
    n0 --> n4
    n4 --> n3
    n3 --> n2
    n2 --> n5
    n5 --> n1
  end
  %% Styling
  classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
  classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
  classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
  classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
  class n0 trigger
  class n5 decision
  class n1 database
  class n4 api
```
The Problem: CSV imports create messy, duplicate reports
CSV-based reporting looks easy until it becomes routine. Someone downloads a file, uploads it to Google Sheets, and tries to “merge” it with last week’s data by hand. A week later, you’re staring at duplicates because the same records arrived again with tiny differences, or because the sort order changed. Then you waste time hunting for the right rows, deleting extras, and second-guessing every chart. Honestly, the worst part is the uncertainty. You never know if your spreadsheet reflects reality or just the last person’s best guess.
It adds up fast. The friction compounds in a few predictable places.
- Manual imports are easy to forget, so your “live” dashboard becomes stale for days.
- Duplicates sneak in because CSV rows rarely have a clean, single ID field you can rely on.
- Filtering “just what matters” usually happens after the import, which means extra edits every run.
- Google Sheets can slow down when you try to read and write too much data at once, so big CSV files become a headache.
The Solution: Automatically update Google Sheets from a CSV URL
This n8n workflow turns a CSV link into a Google Sheet that stays current without turning into a duplicate-filled junk drawer. It starts by fetching the CSV file from a specific URL (in the sample workflow, it’s an ECDC public dataset). n8n then parses the CSV into structured JSON rows so each column becomes a usable field in the automation. Next, it creates a unique key by combining two fields (country_code and year_week), which becomes the “match key” for Google Sheets updates. After filtering the data down to a manageable slice (DE, AT, CH and weeks in 2023), the workflow writes to Google Sheets using an append-or-update action, so existing rows get updated and new rows get added.
The workflow kicks off on a manual run right now, which is perfect for testing. From there it pulls the CSV via HTTP, converts it, adds a unique identifier, filters to a subset, then updates your sheet using that unique key so you don’t get repeated rows.
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Downloading the CSV from a fixed URL on every run | A Google Sheet that stays current without manual imports |
| Parsing the CSV into structured rows | Columns you can filter and chart like normal sheet data |
| Building a unique_key from country_code and year_week | Existing rows get updated instead of duplicated |
| Filtering to DACH countries and 2023 weeks | Smaller writes that stay within Google Sheets limits |
| Appending or updating rows on a matched key | Hours of weekly de-duping and re-checking back |
Example: What This Looks Like
Say you refresh one CSV-driven dashboard every weekday. Manually, a realistic run looks like: about 10 minutes to download/find the right CSV, 10 minutes to import and fix column types, and 10 minutes to de-dupe and re-check charts. Call it about 30 minutes per day, or roughly 2.5 hours a week. With this workflow, you click execute (or schedule it later), wait a minute or two for processing, and your Google Sheet updates without duplicate rows. That’s a couple hours back, and the numbers stop drifting.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets to store and update your rows.
- A CSV URL that returns a downloadable CSV file.
- Google account access with permission to edit the target sheet.
Skill level: Beginner. You’ll connect Google, paste a CSV URL, and confirm your sheet columns match the data.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
Manual trigger to test safely. You click “Execute Workflow” in n8n, which is great when you’re validating the CSV format and your sheet mapping.
CSV retrieval and parsing. n8n fetches the CSV file via HTTP Request, then the Spreadsheet File step converts it into structured rows you can work with like normal spreadsheet data.
De-duplication logic through a unique key. A Set step creates a unique_key by combining country_code and year_week. Then a filter keeps only the slice you want (in the sample, DACH countries and 2023 weeks) to avoid hammering Google’s read/write limits.
Google Sheets append-or-update. The final step writes to your chosen spreadsheet using “appendOrUpdate” and matches on unique_key, which means existing rows get updated instead of duplicated.
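To make the append-or-update behavior concrete, here is a minimal plain-JavaScript sketch of the idea: the sheet is modeled as an array of row objects, and incoming rows either overwrite a row with the same match column or get appended. This mirrors the concept, not the Google Sheets API itself.

```javascript
// Sketch of "appendOrUpdate" semantics over an in-memory sheet.
function appendOrUpdate(sheet, incoming, matchColumn = 'unique_key') {
  for (const row of incoming) {
    const existing = sheet.find(r => r[matchColumn] === row[matchColumn]);
    if (existing) {
      Object.assign(existing, row); // matched key: update in place, no duplicate
    } else {
      sheet.push({ ...row });       // unseen key: append as a new row
    }
  }
  return sheet;
}

const sheet = [{ unique_key: 'DE-2023-01', new_cases: 100 }];
appendOrUpdate(sheet, [
  { unique_key: 'DE-2023-01', new_cases: 120 }, // updates the existing row
  { unique_key: 'AT-2023-01', new_cases: 40 },  // appends a new row
]);
// sheet now holds 2 rows, with DE-2023-01 refreshed to 120
```

Because the match happens per key, re-running the workflow on the same CSV is safe: repeated rows update themselves instead of piling up.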
You can easily modify the filter to include more countries or a different date pattern based on your needs. See the full implementation guide below for customization options.
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
This workflow starts on demand so you can validate CSV imports before automating them.
- Add and keep Manual Run Trigger as the start node of the workflow.
- Confirm there are no parameters required for Manual Run Trigger.
Step 2: Connect Google Sheets
Configure the destination spreadsheet and authenticate Google Sheets.
- Open Append to Spreadsheet and set Operation to `appendOrUpdate`.
- Set Document ID to the Google Sheets URL: `https://docs.google.com/spreadsheets/d/[YOUR_ID]`.
- Set Sheet Name to `COVID-weekly`.
- In Columns, ensure Mapping Mode is `autoMapInputData` and Matching Columns includes `unique_key`.
- Set Cell Format to `USER_ENTERED`.
- Credential Required: connect your googleSheetsOAuth2Api credentials in Append to Spreadsheet.
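If you ever need the bare spreadsheet ID rather than the full link, it sits between `/d/` and the next slash in the URL. Here is a small hypothetical helper (not part of n8n) that extracts it:

```javascript
// Hypothetical helper: pull the document ID out of a full Google Sheets URL.
function sheetIdFromUrl(url) {
  const match = url.match(/\/spreadsheets\/d\/([a-zA-Z0-9_-]+)/);
  if (!match) throw new Error('Not a Google Sheets URL');
  return match[1];
}

const id = sheetIdFromUrl('https://docs.google.com/spreadsheets/d/1AbC-xyz/edit#gid=0');
// id === '1AbC-xyz'
```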
Step 3: Set Up CSV Retrieval and Parsing
Fetch the CSV file from the public endpoint and parse it into structured rows.
- In Retrieve CSV File, set URL to `https://opendata.ecdc.europa.eu/covid19/testing/csv/data.csv`.
- Ensure Retrieve CSV File uses the file response format so the CSV can be parsed downstream.
- In Parse CSV Data, set File Format to `csv`.
- Enable Header Row in Parse CSV Data so column names are detected.
- Connect Manual Run Trigger → Retrieve CSV File → Parse CSV Data.
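Conceptually, header-row parsing works like this naive sketch: the first line supplies field names, and every later line becomes one object. (Real parsers, including n8n's, also handle quoted fields containing commas, which this simplified split does not.)

```javascript
// Naive sketch of header-row CSV parsing.
function parseCsv(text) {
  const [headerLine, ...lines] = text.trim().split('\n');
  const headers = headerLine.split(',');
  return lines.map(line => {
    const cells = line.split(',');
    return Object.fromEntries(headers.map((h, i) => [h, cells[i]]));
  });
}

const csv = 'country_code,year_week,new_cases\nDE,2023-01,100\nAT,2023-01,40';
const rows = parseCsv(csv);
// rows[0] → { country_code: 'DE', year_week: '2023-01', new_cases: '100' }
```

Note the values stay strings after parsing; that is fine here because the unique key and filter both work on string fields.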
Step 4: Configure Data Enrichment and Filtering
Create a unique key for de-duplication and filter records to DACH countries in 2023.
- In Create Unique Key, add a field unique_key with String Value set to `={{ $json.country_code }}-{{ $json.year_week }}`.
- In Filter DACH 2023, add a string condition where Value 1 is `={{ $json.year_week }}`, Operation is `startsWith`, and Value 2 is `2023`.
- Add a boolean condition in Filter DACH 2023 with Value 1 set to `={{ ['DE', 'AT', 'CH'].includes($json.country_code) }}` and Value 2 set to `true`.
- Connect Parse CSV Data → Create Unique Key → Filter DACH 2023 → Append to Spreadsheet.
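The two nodes above boil down to a few lines of JavaScript. This sketch applies the same expressions to plain row objects so you can see what survives the filter:

```javascript
// Mirrors ={{ $json.country_code }}-{{ $json.year_week }}
function addUniqueKey(row) {
  return { ...row, unique_key: `${row.country_code}-${row.year_week}` };
}

// Mirrors the two filter conditions: 2023 weeks and DACH membership.
function keepDach2023(row) {
  return row.year_week.startsWith('2023') &&
         ['DE', 'AT', 'CH'].includes(row.country_code);
}

const rows = [
  { country_code: 'DE', year_week: '2023-01' },
  { country_code: 'FR', year_week: '2023-01' }, // dropped: not DACH
  { country_code: 'AT', year_week: '2022-52' }, // dropped: not 2023
];
const output = rows.map(addUniqueKey).filter(keepDach2023);
// output: one row, unique_key 'DE-2023-01'
```

Widening the filter later only means editing `keepDach2023`'s equivalents in the Filter node; the key format can stay untouched so existing sheet rows keep matching.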
Step 5: Test and Activate Your Workflow
Validate the CSV import and confirm rows are appended or updated based on the unique key.
- Click Execute Workflow to run from Manual Run Trigger.
- Verify Append to Spreadsheet shows successful inserts/updates in the `COVID-weekly` sheet.
- Confirm that only records matching 2023 and DACH country codes appear in the sheet.
- When satisfied, toggle the workflow Active to enable production use.
Common Gotchas
- Google Sheets credentials can expire or need specific permissions. If things break, check the n8n Credentials section and your Google account’s access to the target spreadsheet first.
- If you later switch the manual trigger to Cron, runs can overlap when the CSV is slow to download. Space schedules out a bit so you don’t get two updates competing for the same sheet.
- That unique_key needs to stay stable. If you change how it’s built (or your CSV headers change), you can accidentally “miss matches” and create a new set of near-duplicates.
Frequently Asked Questions
How long does setup take?
About 30 minutes if your Google Sheet is ready.
Do I need to know how to code?
No. You’ll mostly paste the CSV URL and map fields to your sheet columns.
Is this free to run?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You won’t have extra API costs for the built-in HTTP Request and Google Sheets steps beyond whatever Google account you already use.
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I change which countries or years are included?
Yes, and it’s straightforward. Update the filter that currently keeps only DE/AT/CH and year_week starting with 2023, then keep the same unique_key approach so updates still match the right rows. Common tweaks include expanding to more country codes, switching the year pattern, or replacing the unique_key formula to match whatever ID fields your CSV actually has.
What if the Google Sheets connection fails?
Most of the time it’s an authorization issue. Reconnect your Google Sheets credentials in n8n and confirm the Google account has edit access to that specific spreadsheet. Also check that you didn’t change the spreadsheet ID or sheet name since the workflow was built, because a renamed tab can look like a “failed connection.” If you’re pushing a lot of rows, you can also run into Google API limits, so filtering down the dataset (like this workflow does) matters.
How much data can this handle?
It depends more on Google Sheets limits than n8n. If you self-host n8n, you won’t have execution caps, but you still want to keep each run reasonably sized to avoid slow writes. Practically, most teams keep each run to a few thousand rows or less, then filter or segment if the CSV is huge.
Is n8n better than Zapier or Make for this?
Often, yes. n8n is a better fit when you need reliable parsing, custom keys for de-duplication, and the ability to self-host if volume grows. Zapier and Make are fine for lighter jobs, but CSV handling can get awkward once you’re transforming data or updating rows based on a match field. With this workflow, the “appendOrUpdate” behavior is the whole point, and n8n handles that logic cleanly. If you’re unsure, Talk to an automation expert and describe your CSV and reporting needs.
Once this is running, your Google Sheet becomes a dependable destination instead of a recurring chore. Set it up, keep the key stable, and move on.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.