January 22, 2026

Postgres to GitHub, versioned CSV backups made easy

Lisa Granqvist, Workflow Automation Expert

Manual database exports sound simple until you miss one table, overwrite the wrong file, or realize last week’s “backup” never actually ran.

This Postgres GitHub backup automation hits developers first, but data analysts and ops-minded founders feel it too. You get a clean, versioned CSV snapshot in a GitHub repo every day, without someone babysitting pgAdmin at 6pm.

Below is the exact workflow logic, what it produces, and where teams usually tweak it so it matches their repo structure and naming conventions.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Postgres to GitHub, versioned CSV backups made easy

The Problem: Database “Backups” That Aren’t Actually Reliable

Most small teams don’t fail at backups because they don’t care. They fail because the process is annoying, easy to forget, and weirdly fragile. Someone exports a table, downloads a CSV, renames it “final-final.csv”, and drops it into a shared folder that nobody audits. Then the schema changes. A new table appears. Or a critical table grows and the export takes longer, so it gets skipped “just this once”. A month later, you need last Tuesday’s data and you discover your snapshots are incomplete, inconsistent, or missing entirely.

The friction compounds. Here’s where it breaks down in real life.

  • Exports don’t scale with your schema, so new public tables quietly never get included.
  • CSV files get overwritten or scattered into messy folders, which makes “what changed?” painful to answer.
  • Manual snapshots invite human error, and a single bad export can look “fine” until you try to use it.
  • You lose time doing busywork instead of using the data, and honestly that’s the most expensive part.

The Solution: Daily Postgres-to-GitHub CSV Snapshots

This n8n workflow runs on a daily schedule and turns every table in your Postgres public schema into a CSV file stored in a GitHub repository. First, it checks what’s already in the repo, so it can avoid duplicate files and decide whether it should update an existing table export or create a new one. Then it pulls the current list of tables from Postgres, loops through them in batches, extracts the rows for each table, and converts that data into a CSV file. Finally, it either updates the matching file in GitHub (so history is preserved as commits) or uploads a brand-new CSV if the table didn’t exist yesterday.

The workflow starts with a schedule trigger. In the middle, it compares “tables in Postgres” vs. “files in GitHub” and generates one CSV per table. At the end, GitHub becomes your daily, versioned snapshot source of truth.
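
To make that routing concrete, here is a plain-JavaScript outline of the per-table decision, written only as an illustration. In the actual workflow this logic lives across n8n nodes, and the naming convention (each CSV named after its table) comes from the template's Generate CSV File step.

```javascript
// Illustration only: the per-table decision the workflow makes each day.
function planAction(tableName, repoFileNames) {
  const csvName = tableName; // the template names each CSV after its table
  return repoFileNames.includes(csvName)
    ? { action: 'update', file: csvName }  // existing file -> new commit on the same path
    : { action: 'upload', file: csvName }; // new table -> brand-new file in the repo
}

console.log(planAction('users', ['users', 'orders']));    // { action: 'update', file: 'users' }
console.log(planAction('invoices', ['users', 'orders'])); // { action: 'upload', file: 'invoices' }
```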


Example: What This Looks Like

Say you have 20 public tables and you export them manually once a day. Even if each table takes only about 5 minutes to query, export, name correctly, and upload, that’s roughly 100 minutes daily (and that’s on a “nothing went weird” day). With this workflow, the “work” is basically zero: the schedule trigger runs automatically, then you just let it process for a bit in the background. Most teams go from “someone loses an hour every morning” to “check GitHub if you’re curious.”

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Postgres for the source tables in the public schema
  • GitHub to store versioned CSV snapshots in a repo
  • Postgres credentials (create a read-only user in your DB)

Skill level: Beginner. You’ll connect Postgres and GitHub in n8n and choose the target repo, no coding required.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

A daily schedule kicks things off. n8n runs the workflow every 24 hours, so you don’t rely on someone remembering to do exports after standup.

GitHub is checked first. The workflow lists files in your target repo and compiles the filenames, which means it can decide what needs to be updated versus created from scratch.

Postgres tables are discovered and exported. It queries Postgres for the current list of tables in the public schema, then loops through them in manageable batches, selects rows from each table, and converts the result into a CSV file.

Files are updated or uploaded. If a CSV already exists for that table, the workflow updates the GitHub file (creating a commit). If it’s a new table, it uploads a new file into the repo.

You can easily modify the schema, naming rules, or repo folder structure to match your internal conventions. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Scheduled Trigger

Set the workflow to run automatically on a daily schedule.

  1. Add and open Scheduled Daily Trigger.
  2. Set Rule to an hours interval and set hoursInterval to 24 so the workflow runs once per day.

Tip: You can switch the schedule to a fixed time if your backups need to run during off-peak hours.

Step 2: Connect GitHub and Catalog Existing Files

Pull the existing repository filenames so the workflow can decide whether to update or upload backups.

  1. Open List Repo Files GitHub and set Owner to your GitHub username and Repository to your backup repo (the template ships with the placeholders user and github-repo).
  2. Keep Resource set to file and Operation set to list.
  3. Leave File Path as an empty expression (just =) so the node lists files at the repository root.
  4. Credential Required: Connect your githubOAuth2Api credentials in List Repo Files GitHub.
  5. Open Aggregate Repo Filenames and set Operation to aggregateItems with Fields To Aggregate set to name.

Tip: Confirm the repository contains the expected backup folder structure before you run the workflow.
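
For clarity, here is roughly what that aggregation produces, sketched as an n8n Code node. The template uses the built-in Aggregate node for this step, so this is only to show the data shape; the example filenames are made up.

```javascript
// Code-node sketch of what Aggregate Repo Filenames produces (the template uses the
// Aggregate node, not custom code). One input item per file from List Repo Files GitHub.
const names = $input.all().map(item => item.json.name);
return [{ json: { name: names } }]; // single item, e.g. { name: ['users', 'orders'] }
```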

Step 3: Connect Postgres and Load the Table Catalog

Query your database to list all public tables that should be backed up.

  1. Open Retrieve Table Catalog and set Schema to information_schema.
  2. Set Table to tables and add a Where filter with column table_schema and value public.
  3. Credential Required: Connect your postgres credentials in Retrieve Table Catalog.
  4. Open Iterate Batches (first instance) to process the catalog in batches.
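
If you want the run to fail loudly when the catalog comes back empty (instead of quietly producing zero backups), you could add an optional Code node right after Retrieve Table Catalog. This is not part of the stock template, just a small sketch of the idea.

```javascript
// Optional guard: stop the run if no tables were found in the catalog.
const tables = $input.all();
if (tables.length === 0) {
  throw new Error('No tables returned from Retrieve Table Catalog - check the table_schema filter.');
}
return tables; // each item.json includes table_name from information_schema.tables
```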

Step 4: Extract Table Records and Build CSV Files

Loop through each table, fetch records, and convert them to CSV.

  1. Open Transform Script and keep JavaScript Code set to return $input.all();.
  2. Open Fetch Table Records and set Operation to select with Return All enabled.
  3. Set Table to the expression {{ $json.table_name }} and Schema to public.
  4. Credential Required: Connect your postgres credentials in Fetch Table Records.
  5. Open Generate CSV File and set Binary Property Name to =data.
  6. Set File Name to {{ $('Retrieve Table Catalog').item.json.table_name }} so each CSV is named after its table.
  7. Connect Generate CSV File to Iterate Batches as shown in the workflow.

⚠️ Common Pitfall: If the table list is empty, verify the Where clause in Retrieve Table Catalog filters public correctly.
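
The default Transform Script simply passes data through, which is what most teams want on day one. If you later need to exclude a few tables from the backup, a replacement script could look like the sketch below; the table names in SKIP are placeholders, not part of the template.

```javascript
// Hypothetical replacement for the default "return $input.all();" in Transform Script.
// Drops tables you never want exported and passes everything else through unchanged.
const SKIP = new Set(['migrations', 'schema_migrations']); // placeholder names - use your own

return $input.all().filter(item => !SKIP.has(item.json.table_name));
```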

Step 5: Split Records and Route to GitHub Update or Upload

Process one CSV at a time and decide whether to update an existing file or upload a new one.

  1. Open Split Single Records and set Batch Size to 1.
  2. Open Verify Repo File Presence and set the string condition to:
    • Value 1: {{ $node['Aggregate Repo Filenames'].json.name }}
    • Value 2: {{ $binary.data.fileName }}
    • Operation: contains
  3. Note that Verify Repo File Presence routes to Update GitHub File when the condition matches and to Upload GitHub File when it doesn’t.
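
To spell out what that condition decides, here is the same check written as a Code node sketch. The workflow does this inside the Verify Repo File Presence node, so you don’t need this code; it only shows the true/false result that drives the routing.

```javascript
// Sketch of the "contains" check: does the repo already hold a file with this CSV's name?
const existingNames = $('Aggregate Repo Filenames').first().json.name || [];

return $input.all().map(item => ({
  json: { fileExists: existingNames.includes(item.binary.data.fileName) }, // true -> update, false -> upload
  binary: item.binary,
}));
```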

Step 6: Configure GitHub Update and Upload Actions

Write CSV backups into your GitHub repository with unique commit messages.

  1. Open Update GitHub File and set Resource to file and Operation to edit.
  2. Set File Path to {{ $binary.data.fileName }} and enable Binary Data.
  3. Set Commit Message to =backup-{{ $now.toMillis() }}.
  4. Credential Required: Connect your githubOAuth2Api credentials in Update GitHub File.
  5. Open Upload GitHub File and set Resource to file with Binary Data enabled.
  6. Set File Path to {{ $binary.data.fileName }}.
  7. Set Commit Message to =backup-{{ $node['Set commit date'].json.commitDate }}.
  8. Credential Required: Connect your githubOAuth2Api credentials in Upload GitHub File.

⚠️ Common Pitfall: The expression {{ $node['Set commit date'].json.commitDate }} references a node not present in this workflow. Replace it with a valid expression (for example, {{ $now.toMillis() }}) or add the missing node before running.
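
If you’d rather keep the original expression, one option is to add that missing node yourself. A minimal “Set commit date” Code node placed before Upload GitHub File might look like the sketch below; this node and its commitDate field are assumptions, not part of the shipped template.

```javascript
// Hypothetical "Set commit date" Code node: attach a commitDate each item can reference.
return $input.all().map(item => ({
  json: { ...item.json, commitDate: new Date().toISOString().slice(0, 10) }, // e.g. "2026-01-22"
  binary: item.binary,
}));
```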

Step 7: Test & Activate Your Workflow

Validate the end-to-end backup flow and enable the automation.

  1. Click Execute Workflow to run a manual test from Scheduled Daily Trigger.
  2. Confirm that List Repo Files GitHub returns file names and that Retrieve Table Catalog returns public tables.
  3. Verify that Generate CSV File outputs a binary file named after each table.
  4. Check that Verify Repo File Presence routes files to Update GitHub File or Upload GitHub File correctly.
  5. Open your GitHub repository and confirm new or updated CSV files with the expected commit message.
  6. Toggle the workflow Active to enable daily backups.

Common Gotchas

  • GitHub credentials can expire or need specific permissions. If things break, check your n8n GitHub credential and the repo access scope first.
  • Large tables take longer to export. If runs start failing as your data grows, lower the batch size in the loop nodes or move the schedule to off-peak hours.
  • GitHub rate limits can kick in when you export lots of tables in one run. Keeping batches small spreads out the API calls and keeps commits reliable.

Frequently Asked Questions

How long does it take to set up this Postgres GitHub backup automation?

About 30 minutes if your Postgres and GitHub access are ready.

Do I need coding skills to automate Postgres GitHub backup?

No. You connect accounts and adjust a couple of configuration fields in n8n.

Is n8n free to use for this Postgres GitHub backup workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in GitHub and Postgres usage (usually $0 for typical API access).

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this Postgres GitHub backup workflow for a different schema or folder structure?

Yes, and it’s a common tweak. You can change the Postgres catalog query in the “Retrieve Table Catalog” node to target a different schema, and you can adjust the destination path in the “Upload GitHub File” and “Update GitHub File” nodes to write into a subfolder like /snapshots/public/. Many teams also modify the “Transform Script” node to standardize filenames (lowercase, underscores) and to skip tables they don’t want exported.
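
As a rough sketch of that filename-standardization tweak (again, not part of the stock template), the “Transform Script” node could rewrite table_name before the CSV is generated:

```javascript
// Sketch: lowercase table names and collapse anything non-alphanumeric into underscores
// so the CSV filenames in GitHub stay consistent.
return $input.all().map(item => ({
  json: {
    ...item.json,
    table_name: String(item.json.table_name).toLowerCase().replace(/[^a-z0-9]+/g, '_'),
  },
}));
```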

Why is my GitHub connection failing in this workflow?

Usually it’s an OAuth scope or repo permission issue. Reconnect the GitHub credential in n8n, confirm you granted access to the target repository, and then re-run the “List Repo Files GitHub” node to verify it can read contents. If updates fail but listing works, check whether the workflow is trying to write to a protected branch. Rate limits are also possible if you’re exporting lots of tables at once, so batching helps.

How many tables can this Postgres GitHub backup automation handle?

Dozens is normal, and hundreds can work if your database and repo are sized for it.

Is this Postgres GitHub backup automation better than using Zapier or Make?

Often, yes, because this job is more than a simple two-step zap. n8n handles looping through tables, branching logic, and file updates cleanly, and you can self-host for unlimited executions. Zapier and Make can do parts of it, but you’ll usually end up juggling iterators, file transforms, and pricing limits. If you already live in GitHub and care about version history, n8n is a practical fit. Talk to an automation expert if you’re not sure which fits.

Once this is running, GitHub quietly becomes your daily snapshot library. Set it up, let it commit, and use your time for work that actually moves things forward.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
