January 21, 2026

Notion to Google Drive, backups you can trust

Lisa Granqvist, Workflow Automation Expert

You don’t notice how fragile Notion feels until you need a restore. A key database gets changed, a view is overwritten, someone deletes a property, and now you’re digging through “undo” history hoping it’s enough.

This Notion-to-Drive backup automation matters most to ops managers and small teams, because they live in databases all day. But agency owners running client workspaces and marketers tracking campaigns in Notion feel the same anxiety when the “source of truth” has no real backup.

This workflow copies your Notion databases into Google Drive, organizes them into clean folders, deletes older backups based on retention rules, and pings you on Telegram so you can actually relax. You’ll see what it does, what you need, and how it fits into a real week of work.

How This Automation Works

See how this solves the problem:

n8n Workflow Template: Notion to Google Drive, backups you can trust

The Challenge: Notion Isn’t a Backup System

Notion is brilliant for running projects, storing client assets, tracking content, and keeping databases that power day-to-day decisions. But it’s also easy to accidentally break. A “quick cleanup” turns into a missing column. A filter is changed and suddenly someone thinks rows disappeared. Even when nothing goes wrong, teams still ask the same question: “If this workspace got messed up today, how fast could we restore it?” Honestly, most people don’t have a good answer because manual exports are annoying, inconsistent, and easy to forget.

It adds up fast. Here’s where it breaks down in real life.

  • Manual Notion exports are a “someday” task, so backups tend to be missing right when you need them.
  • Files end up scattered in Drive with random names, which makes restore day feel like detective work.
  • Teams back up “important databases” but forget the supporting ones, so you never have a complete snapshot.
  • Without retention rules, old backups pile up and nobody knows which folder is safe to delete.

The Fix: Automated Notion Database Backups in Drive

This workflow creates a repeatable backup routine for Notion that does not depend on someone remembering. It starts by generating a fresh Google Drive folder for the backup run (so each run stays clean and self-contained). Then it pulls a list of your Notion databases and loops through them in batches, fetching pages for each database and compiling the content into a backup file. Once the archive is built, it uploads the backup to Drive and also uploads a small metadata file that acts like a receipt (what was backed up, when, and where it landed). After that, it checks your retention settings, finds older backup folders, and deletes the ones that are past your cutoff. Finally, you get a Telegram message confirming the run, so you’re never guessing.

The workflow kicks off from a manual trigger in n8n (you can later schedule it). From there, Notion data is collected and packaged, Google Drive stores the archive and metadata, and Telegram confirms success or flags cleanup actions. No more “I think we backed it up last month.”


Real-World Impact

Say you back up 10 Notion databases each week. Manually, exporting, naming, and filing each one is maybe 10 minutes per database, plus another 20 minutes to double-check you didn’t miss anything, so you’re spending about 2 hours weekly. With this workflow, you kick it off (or schedule it), wait for Drive uploads to finish, and you’re done in about 5 minutes of active effort. That’s roughly an afternoon saved every month, and your backups are actually consistent.

Requirements

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Notion for database access and exports
  • Google Drive to store backups and metadata files
  • Telegram bot token + chat ID (from BotFather and your Telegram chat)

Skill level: Intermediate. You’ll connect credentials and paste a few IDs (Notion integration access, Drive folder ID, Telegram chat ID).

Need help implementing this? Talk to an automation expert (free 15-minute consultation).

The Workflow Flow

Manual run starts the backup. You click “Execute” in n8n (many teams later swap this for a schedule), and a configuration step sets things like naming and retention behavior.

A fresh Drive folder is created. Google Drive generates a new “backup run” folder so every backup stays grouped, which makes restores far less painful.
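The folder name the workflow generates comes from an n8n expression using $now.format('yyyy-MM-dd_HHmmss'). As a rough sketch of the same logic in plain JavaScript (the function name is just for illustration):

```javascript
// Reproduces the timestamped name that the "Generate Drive Folder"
// node builds via n8n's $now.format('yyyy-MM-dd_HHmmss') expression.
function backupFolderName(date) {
  const pad = (n, w = 2) => String(n).padStart(w, '0');
  const stamp =
    `${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())}` +
    `_${pad(date.getHours())}${pad(date.getMinutes())}${pad(date.getSeconds())}`;
  return `Notion_Backup_V2_${stamp}`;
}
```

Because every run gets a unique timestamp, backups never overwrite each other, and the name doubles as a sort key in Drive.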

Notion databases are retrieved and processed in batches. The workflow pulls your database list, loops through each one, fetches its pages, and builds a backup archive plus a small metadata file that summarizes what happened.

Files upload, then cleanup runs, then you get a Telegram message. The archive and metadata are saved to Drive, older backup folders are listed and filtered based on age, and the workflow deletes what no longer fits your retention rules. A Telegram alert closes the loop so you don’t have to check Drive manually.
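The retention check boils down to a date comparison. Here is a minimal sketch, assuming each folder carries a createdTime ISO string the way the Google Drive API returns it (the function name is hypothetical):

```javascript
// Returns the backup folders that fall outside the retention window.
// Only folders matching the backup naming prefix are considered.
function foldersToDelete(folders, retentionDays, now = new Date()) {
  const cutoff = now.getTime() - retentionDays * 24 * 60 * 60 * 1000;
  return folders.filter(
    (f) =>
      f.name.startsWith('Notion_Backup_') &&
      new Date(f.createdTime).getTime() < cutoff
  );
}
```

Filtering on the name prefix is what keeps the cleanup from touching unrelated folders in the same parent directory.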

You can easily modify retention rules to keep backups longer, or change the folder naming to match your client/project structure. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Manual Trigger

This workflow starts manually, making it ideal for on-demand backups while you validate configuration.

  1. Add the Manual Start Trigger node as the trigger to initiate the workflow on demand.
  2. Leave all parameters at default since Manual Start Trigger has no required fields.

Step 2: Connect Notion and Google Drive

These nodes fetch Notion databases and create a Google Drive backup folder for storage.

  1. Open Generate Drive Folder and set Name to =Notion_Backup_V2_{{ $now.format('yyyy-MM-dd_HHmmss') }}.
  2. In Generate Drive Folder, set Folder ID to ={{ $json.parent_folder_id }} and keep Resource as folder.
  3. Open Retrieve Notion Databases and set Resource to database, Operation to getAll, and Return All to true.
  4. Credential Required: Connect your googleDriveOAuth2Api credentials in Generate Drive Folder (and reuse the same credentials in all Google Drive nodes).
  5. Credential Required: Connect your notionApi credentials in Retrieve Notion Databases and Fetch Database Pages.
⚠️ Common Pitfall: The parent folder ID comes from Backup Configurator. If it is left as YOUR_GOOGLE_DRIVE_PARENT_FOLDER_ID, folder creation will fail.
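If you want to catch that pitfall before the Drive call fails, a small guard in a Code node works. This is just a sketch; the placeholder string matches the template's default value:

```javascript
// Fails fast with a clear message if the parent folder ID was never
// replaced in Backup Configurator.
function assertConfigured(parentFolderId) {
  if (!parentFolderId || parentFolderId === 'YOUR_GOOGLE_DRIVE_PARENT_FOLDER_ID') {
    throw new Error('Set parent_folder_id in Backup Configurator before running.');
  }
  return parentFolderId;
}
```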

Step 3: Set Up Backup Configuration and Tracking

These nodes define backup preferences and prepare the database list for batch processing.

  1. Open Backup Configurator and set backup_format to json.
  2. Set retention_days to 5 and enable_cleanup to true.
  3. Set parent_folder_id to your Google Drive parent folder ID and telegram_chat_id to your target Telegram chat ID.
  4. Keep Initialize Backup Tracker as-is; it uses Generate Drive Folder to attach folder metadata for each database.
  5. In Iterate Databases Batch, keep defaults; it controls the loop over each Notion database.
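Taken together, the configurator's output is just a small settings object. A sketch of what it emits, with placeholder IDs you replace with your own values:

```javascript
// Example settings object produced by the "Backup Configurator" step.
// The two placeholder IDs are the values you must fill in yourself.
const backupConfig = {
  backup_format: 'json',
  retention_days: 5,
  enable_cleanup: true,
  parent_folder_id: 'YOUR_GOOGLE_DRIVE_PARENT_FOLDER_ID',
  telegram_chat_id: 'YOUR_TELEGRAM_CHAT_ID',
};
```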

Step 4: Configure Backup File Generation and Upload

This section retrieves Notion pages, builds a JSON archive, and uploads each backup into the Drive folder.

  1. In Fetch Database Pages, set Resource to databasePage, Operation to getAll, Return All to true, and Database ID to ={{ $json.database_id }}.
  2. Leave Build Backup File unchanged; it composes JSON content and generates a binary file for upload.
  3. In Upload Backup Archive, set Name to ={{ $json.file_name }} and Folder ID to ={{ $('Generate Drive Folder').first().json.id }}.
  4. Understand the flow: Iterate Databases Batch sends each batch to Fetch Database Pages, then Build Backup File, then Upload Backup Archive, and loops back to Iterate Databases Batch until complete.
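The file-building step amounts to serializing one database's pages into a JSON document with a predictable name. A rough sketch, assuming the field names used elsewhere in the template (file_name) and a simple per-day naming scheme:

```javascript
// Serializes one database's pages into a JSON backup plus a file name
// that the Drive upload node can pick up.
function buildBackupFile(databaseName, pages, runDate = new Date()) {
  const safeName = databaseName.replace(/[^\w-]+/g, '_'); // strip unsafe chars
  const stamp = runDate.toISOString().slice(0, 10);       // yyyy-MM-dd
  return {
    file_name: `${safeName}_${stamp}.json`,
    content: JSON.stringify(
      { database: databaseName, page_count: pages.length, pages },
      null,
      2
    ),
  };
}
```

Including the page count inside the file gives you a quick sanity check on restore day without opening the whole archive.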

Step 5: Configure Metadata, Cleanup Logic, and Notifications

These nodes assemble metadata, optionally delete old backups, and send a Telegram alert.

  1. In Assemble Backup Metadata, keep defaults; it summarizes backup statistics and creates backup_metadata.json.
  2. In Send Metadata File, set Name to backup_metadata.json and Folder ID to ={{ $('Generate Drive Folder').first().json.id }}.
  3. In Cleanup Decision, confirm the condition checks ={{ $('Backup Configurator').first().json.enable_cleanup }} is true.
  4. In List Previous Backups, set Resource to fileFolder, Return All to true, and Query String to Notion_Backup_.
  5. In Folder Deletion Check, ensure the condition checks ={{ $json.has_folders }} equals true before deleting.
  6. In Remove Old Backup, keep Operation as deleteFolder and the folder ID value as ={{ $json.id }}.
  7. In Dispatch Telegram Alert, set Text to ={{ $json.message }} and Chat ID to ={{ $json.chat_id }}.
  8. Credential Required: Connect your telegramApi credentials in Dispatch Telegram Alert.
Google Drive credentials are used by multiple nodes: Generate Drive Folder, Upload Backup Archive, Send Metadata File, List Previous Backups, and Remove Old Backup. Use the same googleDriveOAuth2Api account for consistency.
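The metadata “receipt” that Assemble Backup Metadata writes next to the archives can be sketched like this (the field names here are assumptions, not the template's exact schema):

```javascript
// Summarizes a backup run into the backup_metadata.json receipt:
// when it ran, where it landed, and how much was backed up.
function assembleMetadata(results, folderId, runDate = new Date()) {
  return {
    backed_up_at: runDate.toISOString(),
    backup_folder_id: folderId,
    database_count: results.length,
    total_pages: results.reduce((sum, r) => sum + r.page_count, 0),
    databases: results.map((r) => r.database),
  };
}
```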

Step 6: Test and Activate Your Workflow

Validate the full backup cycle before enabling it for production use.

  1. Click Execute Workflow on Manual Start Trigger to run a full backup.
  2. Confirm a new Google Drive folder named like Notion_Backup_V2_YYYY-MM-DD_HHmmss is created and contains JSON backup files.
  3. Verify backup_metadata.json is uploaded to the same folder.
  4. Check Telegram for the message sent by Dispatch Telegram Alert with the summary and backup folder link.
  5. When satisfied, toggle the workflow to Active for production use.

Watch Out For

  • Notion credentials and sharing are the usual culprits. If a database won’t export, confirm it was shared with your Notion integration in Notion’s “Connections” settings.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Telegram notifications fail silently when the chat ID is wrong. Check the Telegram node execution output first, then regenerate your bot token if needed.

Common Questions

How quickly can I implement this Notion Drive backups automation?

About 30 minutes if your Notion, Drive, and Telegram accounts are ready.

Can non-technical teams implement this backup automation?

Yes. You won’t write code, but you will copy a few IDs and connect credentials in n8n.

Is n8n free to use for this Notion Drive backups workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Telegram (free) and standard Notion/Google account costs.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

How do I adapt this Notion Drive backups solution to my specific challenges?

Start with the “Backup Configurator” step because that’s where naming and retention behavior are set. You can also adjust the cleanup logic by changing the “Cleanup Decision” and “Folder Deletion Check” conditions, which is useful if you want to keep monthly backups longer than weekly ones. If your Drive needs client-specific folders, tweak “Generate Drive Folder” to nest backups under a client directory. Many teams also customize the Telegram message in “Compose Notification” so it includes the Drive folder link and a count of databases backed up.

Why is my Notion connection failing in this workflow?

Usually the databases were never shared with the Notion integration, so n8n can’t “see” them. Recheck the integration in Notion, then confirm the right workspace is selected in your Notion credential. If it fails partway through, it can also be rate limiting from pulling lots of pages at once, so reduce batch size in the loop and run it again.
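Reducing the batch size is just a matter of splitting the database list into smaller chunks before the loop. A minimal sketch of what Iterate Databases Batch does with that setting:

```javascript
// Splits the database list into batches; a smaller batchSize means
// fewer simultaneous Notion API calls per loop iteration.
function toBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}
```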

What’s the capacity of this Notion Drive backups solution?

In practice it scales to dozens of databases per run, and the main limit is how many Notion pages you’re pulling. On n8n Cloud Starter you can run a healthy number of executions per month for a single workspace, and higher plans handle more volume. If you self-host, there’s no execution cap, but your server and Notion’s API limits become the bottleneck. If you have very large databases, schedule backups during quiet hours and keep the batch size modest.

Is this Notion Drive backups automation better than using Zapier or Make?

Often, yes. This workflow uses looping, filtering, file building, and retention cleanup in one place, and that kind of logic gets expensive or awkward in simpler automation tools. n8n also gives you a self-hosting path, which is handy if you want unlimited runs and tighter control over data. Zapier or Make can still be fine for a lightweight “export one thing” use case, but the moment you need batched backups and cleanup, n8n tends to feel more practical. If you’re torn, Talk to an automation expert and we’ll map it to your volume and risk level.

Backups aren’t exciting. But the first time you need one, you’ll be glad this is running in the background, quietly doing the boring work for you.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

