Postgres to Google Sheets, chat transcripts in tabs
Your AI agent is shipping. The logs are not. They’re stuck in Postgres, hard to scan, annoying to share, and every “quick review” turns into SQL, exports, and a spreadsheet that still needs cleanup.
This Postgres-to-Sheets transcript automation helps product managers first, because they need patterns fast. But support leads chasing “what actually happened?” and marketers reviewing real customer language feel it too. You get clean, shareable transcripts in Google Sheets, organized by session.
Below you’ll see exactly how the workflow moves data from your chat history table into one tab per session, what you need to run it, and where teams usually trip up.
How This Automation Works
The full n8n workflow, from trigger to final output:
n8n Workflow Template: Postgres to Google Sheets, chat transcripts in tabs
flowchart LR
subgraph sg0["When clicking ‘Test workflow’ Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking ‘Test workflow’", pos: "b", h: 48 }
n2["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/postgres.svg' width='40' height='40' /></div><br/>Get conversations by sessionId"]
n3["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Duplicate template sheet"]
n4@{ icon: "mdi:database", form: "rounded", label: "Clear Sheet Content", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/httprequest.dark.svg' width='40' height='40' /></div><br/>Rename Sheet"]
n6@{ icon: "mdi:play-circle", form: "rounded", label: "Schedule Trigger", pos: "b", h: 48 }
n7@{ icon: "mdi:swap-vertical", form: "rounded", label: "Set session_id", pos: "b", h: 48 }
n8["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/postgres.svg' width='40' height='40' /></div><br/>Postgres - Get session ids"]
n9@{ icon: "mdi:swap-vertical", form: "rounded", label: "Loop Over Session IDs", pos: "b", h: 48 }
n10@{ icon: "mdi:database", form: "rounded", label: "Add conversations", pos: "b", h: 48 }
n5 --> n7
n7 --> n4
n6 --> n8
n10 --> n9
n4 --> n2
n4 --> n3
n9 --> n4
n3 --> n5
n8 --> n9
n2 --> n10
n0 --> n8
end
subgraph sg1["Flow 2"]
direction LR
n1["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/postgres.svg' width='40' height='40' /></div><br/>add create_at column"]
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0,n6 trigger
class n2,n4,n8,n10,n1 database
class n3,n5 api
classDef customIcon fill:none,stroke:none
class n2,n3,n5,n8,n1 customIcon
The Problem: Chat transcripts are trapped in the database
When your agent conversations live in Postgres (or Supabase), the data is technically safe, but practically invisible. A PM wants to review five sessions before a roadmap meeting. A support lead needs one “bad” conversation to train the team. Someone runs a query, exports CSV, pastes into Sheets, and then wastes time fixing timestamps, ordering messages, and splitting sessions into separate tabs. And the next day, you do it again because the logs changed.
The friction adds up fast, and it compounds.
- Pulling session logs manually usually means at least 30 minutes of query, export, and formatting per request.
- Messages get reviewed out of order, because a CSV export doesn’t “feel” like a conversation.
- Sharing raw database access isn’t an option for most teams, so reviews bottleneck on whoever can run SQL.
- Once people start copy-pasting, small errors creep in and you lose trust in what you’re looking at.
The Solution: Export every session into its own Google Sheets tab
This workflow pulls conversation history from your n8n_chat_histories table in Postgres and turns it into something your whole team can actually use. It starts on a schedule (daily is common) or with a manual run when you need it now. First, it collects the distinct session_id values so it knows which conversations exist. Then it loops through sessions one-by-one, clears the matching tab if it already exists, duplicates a template tab, renames it to the session ID, and appends the full message history as clean rows. The final result is a Google Sheet where every session is a dedicated, readable transcript.
The workflow starts when the schedule trigger fires (or you click manual run). Postgres queries fetch session IDs, then messages for each session. Finally, Google Sheets is rebuilt so the transcript is always current, without you touching CSVs.
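The per-session loop described above can be sketched in plain Python. This is an illustrative stand-in, not the workflow itself: `HISTORY` and `sheet_tabs` simulate the `n8n_chat_histories` table and the Google Sheets document, and the function names are made up for the sketch.

```python
# Illustrative sketch of the workflow's per-session loop.
# HISTORY stands in for the n8n_chat_histories table; sheet_tabs
# stands in for the Google Sheets document. Real runs use the
# Postgres and Google Sheets nodes instead.

HISTORY = [
    {"session_id": "abc-1", "message": {"type": "human", "content": "Hi"}},
    {"session_id": "abc-1", "message": {"type": "ai", "content": "Hello!"}},
    {"session_id": "xyz-2", "message": {"type": "human", "content": "Help"}},
]

def fetch_session_ids(rows):
    """SELECT DISTINCT session_id, preserving first-seen order."""
    return list(dict.fromkeys(r["session_id"] for r in rows))

def build_transcript_tabs(rows):
    """One tab per session: clear/recreate the tab, then append message rows."""
    sheet_tabs = {}
    for sid in fetch_session_ids(rows):
        sheet_tabs[sid] = []  # duplicate template tab + clear old rows
        for r in rows:
            if r["session_id"] == sid:
                sheet_tabs[sid].append((r["message"]["type"], r["message"]["content"]))
    return sheet_tabs

tabs = build_transcript_tabs(HISTORY)
print(tabs["abc-1"])  # [('human', 'Hi'), ('ai', 'Hello!')]
```

The key design point mirrors the workflow: tabs are rebuilt from scratch on every run, so the sheet never drifts out of sync with the database.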
What You Get: Automation vs. Results
| What This Workflow Automates | Results You’ll Get |
|---|---|
| Pulls distinct session IDs and full message histories from your Postgres `n8n_chat_histories` table | No more manual SQL, CSV exports, or copy-paste cleanup |
| Duplicates a template tab, renames it to the session ID, and clears stale rows | One clean, readable transcript tab per session |
| Appends messages as Who / Date / Message rows on a schedule or on demand | A shareable Google Sheet your whole team can review without database access |
Example: What This Looks Like
Say your agent has about 40 sessions a day, and each session has 15–30 messages. Manually, even “just grabbing a few examples” often turns into 10 minutes per session between querying, exporting, splitting, and formatting, so you can burn about 6 hours in a week without noticing. With this workflow, you spend maybe 10 minutes once setting up the Sheet template and credentials, then the daily run happens automatically. You open Google Sheets and the tabs are already there, named by session ID, ready to review.
What You’ll Need
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Postgres to store chat memory logs in a table.
- Google Sheets to review transcripts in shared tabs.
- Google OAuth credentials (create in Google Cloud Console, then connect in n8n).
Skill level: Intermediate. You’ll connect database credentials, set up Google auth, and adjust one SQL query if your schema differs.
Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).
How It Works
A schedule (or manual run) kicks things off. You can run it daily at noon, hourly, or only when you need a fresh batch of transcripts for a meeting.
Session IDs are pulled from Postgres. The workflow queries your n8n_chat_histories table to get distinct session_id values, then loops through them so each conversation gets handled consistently.
Each session gets a dedicated tab. Using a Google Sheets template tab, the workflow duplicates it, renames it to the session ID (via an HTTP request to the Sheets API), then clears any old rows so the transcript is rebuilt cleanly.
Messages are appended as readable rows. For each session, a second Postgres query fetches messages, then Google Sheets appends them with columns like Who (user/assistant), Message, and Date (formatted from created_at).
You can easily modify what identifies a “session” to group by something persistent like user_id based on your needs. See the full implementation guide below for customization options.
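The row mapping above mirrors the n8n expressions (`{{ $json.message.type }}`, `{{ $json.message.content }}`, and the `created_at` formatting). A minimal Python equivalent, assuming messages are stored as JSON objects with `type` and `content` fields (the `to_sheet_row` helper name is ours):

```python
import json
from datetime import datetime

def to_sheet_row(db_row):
    """Convert one n8n_chat_histories row into a (Who, Date, Message) tuple,
    mirroring the expressions used by the Append Conversation Rows node."""
    msg = db_row["message"]
    if isinstance(msg, str):  # some drivers return the jsonb column as text
        msg = json.loads(msg)
    when = datetime.fromisoformat(db_row["created_at"])
    return (msg["type"], when.strftime("%Y-%m-%d %H:%M:%S"), msg["content"])

row = {
    "session_id": "abc-1",
    "message": '{"type": "human", "content": "Where is my order?"}',
    "created_at": "2024-05-01T09:30:00",
}
print(to_sheet_row(row))  # ('human', '2024-05-01 09:30:00', 'Where is my order?')
```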
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
This workflow can be launched manually or on a schedule, so configure both triggers to match your operational needs.
- Add the Manual Launch Trigger node to allow on-demand runs.
- Add the Scheduled Run Trigger node and set the schedule rule to run at 12 (the configuration uses `triggerAtHour = 12`).
- Ensure both Manual Launch Trigger and Scheduled Run Trigger connect to Retrieve Session IDs.
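For reference, the schedule settings above map to a parameter structure like the following. Only `triggerAtHour` is confirmed by this workflow; the surrounding `rule`/`interval` layout is the typical Schedule Trigger shape and may differ slightly between n8n versions.

```python
# Approximate n8n Schedule Trigger parameters for a daily run at 12:00.
# Only triggerAtHour is confirmed by this workflow; the surrounding
# structure is an assumption based on the node's usual layout.
schedule_params = {
    "rule": {
        "interval": [
            {"field": "days", "triggerAtHour": 12}  # once per day at 12:00
        ]
    }
}

print(schedule_params["rule"]["interval"][0]["triggerAtHour"])  # 12
```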
Step 2: Connect Postgres
These nodes query and maintain your session history table in Postgres.
- Open Retrieve Session IDs and confirm Query is set to
select distinct(session_id) from n8n_chat_historieswith Operation set toexecuteQuery. - Open Fetch Session Messages and set the Table to
n8n_chat_histories, Schema to=public, and add a Where condition: Column =session_id, Value ={{ $('Iterate Session List').item.json.session_id }}. - If you plan to run the optional maintenance, configure Utility: Add Created Timestamp with the query
ALTER TABLE ONLY "n8n_chat_histories" ADD COLUMN "created_at" TIMESTAMP DEFAULT NOW();. - Credential Required: Connect your
postgrescredentials to Retrieve Session IDs, Fetch Session Messages, and Utility: Add Created Timestamp.
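To see what the two queries return, here is a stand-in demo using stdlib `sqlite3` (production uses Postgres; the table and column names match the workflow, but the sample rows are invented):

```python
import sqlite3

# Stand-in for the Postgres table queried by the workflow.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE n8n_chat_histories ("
    "  id INTEGER PRIMARY KEY,"
    "  session_id TEXT,"
    "  message TEXT,"
    "  created_at TEXT DEFAULT CURRENT_TIMESTAMP)"
)
conn.executemany(
    "INSERT INTO n8n_chat_histories (session_id, message) VALUES (?, ?)",
    [("abc-1", '{"type": "human", "content": "Hi"}'),
     ("abc-1", '{"type": "ai", "content": "Hello!"}'),
     ("xyz-2", '{"type": "human", "content": "Help"}')],
)

# Query used by "Retrieve Session IDs" (sorted here for a stable demo)
ids = sorted(r[0] for r in conn.execute(
    "select distinct(session_id) from n8n_chat_histories"))
print(ids)  # ['abc-1', 'xyz-2']

# Per-session query issued by "Fetch Session Messages"
msgs = conn.execute(
    "SELECT message FROM n8n_chat_histories WHERE session_id = ?",
    (ids[0],)).fetchall()
print(len(msgs))  # 2
```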
Step 3: Connect Google Sheets
These nodes prepare and update a sheet tab for each session in your conversation log spreadsheet.
- In Purge Sheet Rows, set Document ID to your spreadsheet ID (replace `[YOUR_ID]`), Sheet Name to `{{ $json.session_id }}`, and Range to `A2:C10000`.
- In Clone Template Tab, set URL to `=https://sheets.googleapis.com/v4/spreadsheets/{{ $('Purge Sheet Rows').params.documentId.value }}/sheets/0:copyTo` and ensure the body parameter `destinationSpreadsheetId` is `{{ $('Purge Sheet Rows').params.documentId.value }}`.
- In Retitle Sheet Tab, set URL to `=https://sheets.googleapis.com/v4/spreadsheets/{{ $('Purge Sheet Rows').params.documentId.value }}:batchUpdate` and JSON Body to `{ "requests": [{ "updateSheetProperties": { "properties": { "sheetId": {{ $json.sheetId }}, "title": "{{ $('Purge Sheet Rows').item.json.session_id }}", "hidden": false }, "fields": "title, hidden" } }] }`.
- Credential Required: Connect your `googleSheetsOAuth2Api` credentials to Purge Sheet Rows, Clone Template Tab, and Retitle Sheet Tab.

Note: If you leave the Document ID placeholder as `[YOUR_ID]`, the workflow will fail when clearing and copying tabs.

Step 4: Set Up Session Processing
This step batches session IDs, creates/retitles tabs, and sets the session context for downstream nodes.
- Configure Iterate Session List to process batches from Retrieve Session IDs (default options are sufficient).
- In Assign Session Identifier, add an assignment named `session_id` with value `{{ $('Purge Sheet Rows').first().json.session_id }}`.
- Confirm the flow: Retrieve Session IDs → Iterate Session List → Purge Sheet Rows.
- Purge Sheet Rows outputs to both Fetch Session Messages and Clone Template Tab in parallel.
- Confirm the tab setup branch: Clone Template Tab → Retitle Sheet Tab → Assign Session Identifier → Purge Sheet Rows.
Step 5: Configure Output to Google Sheets
Conversation messages are appended to the correct session tab after they are retrieved from Postgres.
- Open Append Conversation Rows and set Operation to `append`.
- Set Sheet Name to `{{ $('Iterate Session List').item.json.session_id }}` and Document ID to the same spreadsheet used in Purge Sheet Rows.
- Map columns using the existing expressions: Who = `{{ $json.message.type }}`, Date = `{{ $json.created_at.toDateTime().format('yyyy-MM-dd hh:mm:ss') }}`, Message = `{{ $json.message.content }}`.
- Credential Required: Connect your `googleSheetsOAuth2Api` credentials to Append Conversation Rows.
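The Google Sheets node performs the append for you; under the hood it is equivalent to a Sheets API `values.append` call like the one sketched below. `SPREADSHEET_ID`, the sample rows, and the Who/Date/Message column order are placeholders matching this guide, not values from your workflow.

```python
import json

# Sketch of the Sheets API values.append request the node issues.
# SPREADSHEET_ID and the sample rows are placeholders.
SPREADSHEET_ID = "YOUR_SPREADSHEET_ID"
session_id = "abc-1"

rows = [
    ["human", "2024-05-01 09:30:00", "Where is my order?"],
    ["ai", "2024-05-01 09:30:05", "Let me check that for you."],
]

# Range targets the tab named after the session, columns A:C (Who, Date, Message)
endpoint = (
    f"https://sheets.googleapis.com/v4/spreadsheets/{SPREADSHEET_ID}"
    f"/values/{session_id}!A:C:append?valueInputOption=USER_ENTERED"
)
body = json.dumps({"values": rows})
print(endpoint)
print(body)
```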
Step 6: Test and Activate Your Workflow
Run a manual test to confirm session IDs are pulled, tabs are created/cleared, and conversation messages appear correctly.
- Click Execute Workflow using Manual Launch Trigger.
- Verify that Retrieve Session IDs returns session IDs and that each ID is processed by Iterate Session List.
- Confirm that Purge Sheet Rows clears `A2:C10000`, Clone Template Tab creates a new tab, and Retitle Sheet Tab renames it to the current `session_id`.
- Check the spreadsheet to ensure Append Conversation Rows adds rows with Who, Date, and Message.
- Activate the workflow to enable Scheduled Run Trigger for automated exports.
Common Gotchas
- Google Sheets credentials can expire or need specific permissions. If things break, check the n8n Credentials screen and the connected Google account’s access to the target spreadsheet first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Postgres schemas drift in real life. If your `created_at` column is missing or stored in a different format, the Date column in Sheets will look wrong until you update the query and mapping.
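If `created_at` arrives in mixed formats, a tolerant normalizer keeps the Date column consistent. This is an illustrative sketch; the candidate format list is an assumption you should extend to match your actual data.

```python
from datetime import datetime

# Illustrative normalizer for mixed created_at formats; the format
# list is an assumption — extend it to match your real data.
FORMATS = ("%Y-%m-%dT%H:%M:%S", "%Y-%m-%d %H:%M:%S", "%d/%m/%Y %H:%M")

def normalize_created_at(raw):
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d %H:%M:%S")
        except ValueError:
            continue
    return raw  # leave unparseable values visible rather than dropping them

print(normalize_created_at("01/05/2024 09:30"))  # 2024-05-01 09:30:00
```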
Frequently Asked Questions
How long does setup take?
About an hour if your Postgres table and Google credentials are ready.

Do I need to know how to code?
No. You will mainly paste credentials and confirm the Postgres queries match your table.

Is n8n free to use?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Google API usage (usually negligible for normal spreadsheet writes).

Should I use n8n Cloud or self-host?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I group transcripts by user instead of session?
Yes, but you’ll want to be deliberate. The workflow uses an “Assign Session Identifier” step where the session key is set before tabs are created. If your memory config supports a persistent identifier (like user_id), map that value there so the sheet tabs represent users, not sessions. Common tweaks include changing the session ID query, adjusting the tab naming rule, and adding extra columns (like conversation rating or ticket ID) when rows are appended.

Why do my Google Sheets steps keep failing?
Usually it’s expired OAuth consent or the wrong Google account. Reconnect the Google Sheets credential in n8n, then confirm the target spreadsheet is shared with that account. Also check that your HTTP Request steps (used for duplicating and renaming tabs) are pointing at the right spreadsheet ID. If it works for a few sessions and then fails, rate limiting can happen when you try to rebuild a lot of tabs at once.

How many sessions can this handle?
If you self-host n8n, there’s no hard execution limit (it mostly depends on your server and Google API quotas). On n8n Cloud, the practical limit is tied to your monthly executions. In real use, teams commonly process a few hundred sessions per run without issues, as long as they keep the tab creation and writes paced.

Is n8n better than Zapier or Make for this?
Often, yes. The big win is control: looping through sessions, clearing tabs, duplicating a template tab, then appending rows is the kind of multi-step logic that gets clunky (and pricey) in simpler tools. n8n also lets you self-host, which matters when logs grow and you don’t want to pay per tiny step. That said, if you only needed a one-off “export today’s rows to a sheet,” Zapier or Make can be quicker to click together. Talk to an automation expert if you want a recommendation based on your volume.
Once your transcripts land in Sheets automatically, reviews get lighter and decisions get faster. Set it up once, then just open the tab you need.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.