January 22, 2026

Google Docs to Google Sheets, exam scores logged

Lisa Granqvist, Workflow Automation Expert

Marking exams is the kind of work that quietly eats your week. You collect scans, decipher handwriting, cross-check an answer key, then retype results into a spreadsheet (and still worry you missed something).

This is where exam grading automation helps most. A teacher trying to keep up with multiple classes feels it first, but school admins and tutoring center owners deal with the same backlog. The outcome is simple: faster grading, consistent scoring, and a clean paper trail for rechecks.

This workflow uses AI to read a scanned answer sheet, compare it to your Google Docs answer key, calculate marks, and log everything in Google Sheets. You’ll see exactly how it works and what you need to run it.

How This Automation Works

The full n8n workflow, from trigger to final output:

n8n Workflow Template: Google Docs to Google Sheets, exam scores logged

The Problem: Exam Grading Turns Into Spreadsheet Drudgery

Grading isn’t just “checking answers.” It’s finding the right file, zooming in on blurry scans, second-guessing handwriting, and then doing the same comparisons again when a student asks for a recheck. After that, you still have admin work: totals, question-by-question notes, and a spreadsheet someone else can understand later. And if more than one person grades, the scoring style drifts. One small inconsistency becomes ten parent emails. Honestly, the mental load is worse than the time.

It adds up fast. Here’s where it breaks down in real classrooms and training programs.

  • You end up retyping the same student details and totals into Google Sheets for every paper.
  • When scans are low quality, you spend extra minutes per page just trying to read what the student wrote.
  • Rechecks are painful because you don’t have a tidy question-level log of what was marked right or wrong.
  • Even careful graders make copy-paste mistakes, and those mistakes usually show up at the worst time.

The Solution: AI Grades Scans and Logs Results Automatically

This workflow automates the full exam evaluation loop using n8n, Gemini document analysis, and your existing Google Workspace files. A teacher submits an exam through a form along with a scanned answer sheet, and the workflow immediately sends that scan to Gemini to extract the student’s answers. Those extracted answers are then handed to an evaluation agent that also has access to your question paper and your correct answer sheet in Google Docs. The agent compares each response, counts correct versus incorrect, calculates total marks, and produces structured grading output that’s easy to store and review later. Finally, the workflow writes the outcome into Google Sheets twice: once as a clean summary row, and again as a detailed, question-by-question report you can audit.

The workflow starts with an exam submission form and a scan upload. Gemini reads the document, then the evaluation agent scores it against your Google Docs answer key. Google Sheets gets updated with both a summary view and a detailed trail, so you can recheck without regrading.


Example: What This Looks Like

Say you grade 30 scanned papers for a 25-question quiz. Manually, even a quick review takes maybe 5 minutes per paper, plus another 2 minutes to enter totals and notes into Sheets, which is roughly 3.5 hours. With this workflow, the “work” is submitting the form (about 2 minutes per student if you’re uploading in batches) and waiting for processing. In practice, you can finish the whole set in under an hour of hands-on time, with the detailed log already waiting in Google Sheets.
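The arithmetic behind that estimate is simple enough to spell out. This is a back-of-envelope sketch using the per-paper minutes from the example above, which are rough assumptions, not measurements:

```javascript
// Rough grading-time estimate using the numbers from the example above.
const papers = 30;
const manualReviewMin = 5; // reviewing one scanned paper by hand
const manualEntryMin = 2;  // typing totals and notes into Sheets
const automatedMin = 2;    // submitting the form per student (batched uploads)

const manualTotalHours = (papers * (manualReviewMin + manualEntryMin)) / 60;
const automatedTotalHours = (papers * automatedMin) / 60;

console.log(manualTotalHours);    // 3.5 hours of manual work
console.log(automatedTotalHours); // 1 hour of hands-on time
```

The spread only grows with longer exams, since the automated path's cost is per submission, not per question.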

What You’ll Need

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets for summary and detailed score logs
  • Google Docs to store the question paper and answer key
  • Google Gemini access (get it from Google AI Studio / your Google Cloud project)

Skill level: Intermediate. You’ll connect Google accounts, add API credentials, and map a few fields from the form into your sheets.

Don’t want to set this up yourself? Talk to an automation expert (free 15-minute consultation).

How It Works

Exam submission kicks it off. A form trigger collects the teacher’s name and the scanned answer sheet, so grading starts the moment a scan is submitted.

The scan is turned into usable answers. Gemini Document Analysis reads the uploaded file and extracts the student’s responses from the image/PDF, even when the scan isn’t perfect.

AI evaluation does the comparison. An evaluation agent pulls in the question paper and correct answer sheet from Google Docs, cross-checks the extracted responses, then calculates correct/incorrect counts and total marks.

Results land in Google Sheets twice. A summary row is appended for quick reporting, and a second sheet receives the detailed question-by-question breakdown for transparency and rechecks.

You can modify the grading rules to match your own marking scheme. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Form Trigger

This workflow starts when an examiner submits the form and uploads the answer sheet image.

  1. Add and open Exam Submission Form.
  2. Set Form Title to Examiner and Form Description to Examiner AI Agent.
  3. In Form Fields, create a text field with Field Label Examiner Name and mark it required.
  4. Add a file field labeled Upload Answer Sheet and set Accept File Types to .png,.jpg.

Step 2: Connect Google Docs

The evaluation agent uses two Google Docs tools to fetch the question paper and answer key.

  1. Open Fetch Question Document and set Operation to get.
  2. Set Document URL to =https://docs.google.com/document/d/[YOUR_ID]/edit.
  3. Credential Required: Connect your googleDocsOAuth2Api credentials in Fetch Question Document.
  4. Open Retrieve Answer Key and set Operation to get.
  5. Set Document URL to =https://docs.google.com/document/d/[YOUR_ID]/edit.
  6. Credential Required: Connect your googleDocsOAuth2Api credentials in Retrieve Answer Key.

Note: Fetch Question Document and Retrieve Answer Key are AI tools used by Evaluation Orchestrator. Credentials are configured on these tool nodes, while the agent references them during execution.

Step 3: Set Up AI Processing

These nodes analyze the uploaded answer sheet, orchestrate evaluation, and parse structured output.

  1. Open Image Response Analyzer and set Operation to analyze, Resource to image, and Input Type to binary.
  2. Set Binary Property Name to Upload_Answer_Sheet and Text to =The image is an answer paper of a student, you need to analyze and pick each and every answer along with section name, question number and student name.
  3. Credential Required: Connect your googlePalmApi credentials in Image Response Analyzer.
  4. Open Gemini Chat Engine and set Model Name to models/gemini-2.5-pro.
  5. Credential Required: Connect your googlePalmApi credentials in Gemini Chat Engine.
  6. Open Evaluation Orchestrator and set Text to =Marks of the student:\n{{ $json.content.parts[0].text }}\nExaminer Name:{{ $('Exam Submission Form').item.json['Examiner Name'] }}\n\n.
  7. Ensure Prompt Type is define and Has Output Parser is enabled.
  8. Open Structured Result Parser and keep the JSON Schema Example aligned with the required output format shown in the node.

Note: Gemini Chat Engine is connected as the language model for Evaluation Orchestrator — ensure credentials are added to Gemini Chat Engine, not the agent. Structured Result Parser is a sub-node used by Evaluation Orchestrator and does not take credentials directly.
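To make the parser step concrete, here is a hypothetical example of the kind of structured output the agent is expected to produce. The field names ("Student Name", "Total Marks", "Details", and so on) are illustrative assumptions; match them to the JSON Schema Example actually shown in your Structured Result Parser node:

```javascript
// Illustrative grading output shape; field names are assumptions,
// not the template's exact schema.
const gradingOutput = {
  "Student Name": "A. Student",
  "Total Marks": 1,
  "Correct Count": 1,
  "Incorrect Count": 1,
  "Details": [
    { "Question": 1, "Section": "A", "Student Answer": "B", "Correct Answer": "B", "Status": "Correct" },
    { "Question": 2, "Section": "A", "Student Answer": "C", "Correct Answer": "D", "Status": "Incorrect" }
  ]
};

// Sanity check: the summary counts should agree with the per-question detail.
const correct = gradingOutput.Details.filter(d => d.Status === "Correct").length;
console.log(correct); // 1
```

Keeping the summary fields and the Details array in one object is what lets the workflow write both sheets from a single agent run.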

Step 4: Configure Output/Action Nodes

Results are appended to summary and detailed Google Sheets, with a code step reshaping data for per-question rows.

  1. Open Append Summary Row and set Operation to append.
  2. Select the target spreadsheet and sheet: Document ID [YOUR_ID] and Sheet Name gid=0 (cached name Sheet1).
  3. Map columns to expressions, for example Total Marks to {{ $json.output['Total Marks'] }} and Student Name to {{ $json.output['Student Name'] }}.
  4. Credential Required: Connect your googleSheetsOAuth2Api credentials in Append Summary Row.
  5. Open Merge Results to JSON and keep the JS Code as provided to transform the structured output into per-question rows.
  6. Open Append Detailed Scores and set Operation to append.
  7. Select the detailed sheet: Document ID [YOUR_ID] and Sheet Name 760946435 (cached name Scorecard).
  8. Map fields such as Status to {{ $json.Status }} and Question Number to {{ $json.Question }}.
  9. Credential Required: Connect your googleSheetsOAuth2Api credentials in Append Detailed Scores.

Execution Note: Evaluation Orchestrator outputs to both Append Summary Row and Merge Results to JSON in parallel. Merge Results to JSON then feeds Append Detailed Scores.
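The template ships its own JS Code in Merge Results to JSON, which you should keep as provided. As a reference for what such a node typically does, here is a sketch under the assumption that the agent's output carries a Details array; in n8n the Code node body receives `items` and returns `{ json: ... }` objects directly, so the logic is wrapped in a plain function here to make it testable outside n8n:

```javascript
// Sketch of a "Merge Results to JSON" style transformation (field names assumed).
// An n8n Code node would run this body over its incoming `items` and return rows.
function mergeResultsToRows(items) {
  const rows = [];
  for (const item of items) {
    const output = item.json.output; // structured result from the evaluation agent
    for (const detail of output.Details ?? []) {
      rows.push({
        json: {
          "Student Name": output["Student Name"],
          "Question": detail.Question,
          "Student Answer": detail["Student Answer"],
          "Correct Answer": detail["Correct Answer"],
          "Status": detail.Status,
        },
      });
    }
  }
  return rows;
}

// Example input shaped like the parsed agent output.
const rows = mergeResultsToRows([{
  json: {
    output: {
      "Student Name": "A. Student",
      Details: [
        { Question: 1, "Student Answer": "B", "Correct Answer": "B", Status: "Correct" },
        { Question: 2, "Student Answer": "C", "Correct Answer": "D", Status: "Incorrect" },
      ],
    },
  },
}]);

console.log(rows.length); // 2 per-question rows for Append Detailed Scores
```

Each returned row maps one-to-one onto a line in the Scorecard sheet, which is what makes question-level rechecks possible without regrading.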

Step 5: Test and Activate Your Workflow

Run a full submission to confirm AI extraction and sheet updates work as expected.

  1. Click Execute Workflow and submit the Exam Submission Form with a sample answer sheet image.
  2. Verify Image Response Analyzer produces extracted text from the uploaded image.
  3. Confirm Append Summary Row adds a summary entry and Append Detailed Scores adds per-question rows to your sheets.
  4. If output is missing, check that Fetch Question Document and Retrieve Answer Key document URLs are valid and accessible.
  5. Toggle the workflow to Active to enable production submissions.

Common Gotchas

  • Google Sheets credentials can expire or need specific permissions. If things break, check the credential status inside n8n and confirm the Google account can edit the target spreadsheet first.
  • If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
  • Default prompts in AI nodes are generic. Tailor the extraction and evaluation prompts to your exam format early, or you'll be correcting outputs by hand.

Frequently Asked Questions

How long does it take to set up this exam grading automation?

About 30 minutes if your Google files are ready.

Do I need coding skills to set up this exam grading automation?

No. You’ll mostly connect accounts, paste in credentials, and map fields into Google Sheets.

Is n8n free to use for this exam grading automation workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in Gemini API usage costs, which depend on how many pages you process.

Where can I host n8n to run this exam grading automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

Can I customize this exam grading automation workflow for negative marking and partial credit?

Yes, but you’ll want to adjust the evaluation instructions inside the Evaluation Orchestrator agent so it applies your marking rules consistently. You can also tweak the Structured Result Parser so it always returns the exact fields you need (for example, “marks_awarded” per question). Common customizations include negative marking, accepting multiple correct answers, and adding rubric-based scoring for short responses.
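In the workflow itself these rules live in the Evaluation Orchestrator's instructions, but the arithmetic is easier to reason about written out. This sketch uses assumed rule values (+1 correct, -0.25 wrong, +0.5 partial) purely for illustration:

```javascript
// Sketch of per-question scoring with negative marking and partial credit.
// The real workflow expresses these rules in the agent prompt; the values
// here (+1 / -0.25 / +0.5) are assumptions, not the template's defaults.
function scoreQuestion(status, { correct = 1, wrong = -0.25, partial = 0.5 } = {}) {
  switch (status) {
    case "Correct":   return correct;
    case "Partial":   return partial; // e.g. a short answer hitting some rubric points
    case "Incorrect": return wrong;   // negative marking
    default:          return 0;       // unattempted
  }
}

const statuses = ["Correct", "Correct", "Incorrect", "Partial", "Unattempted"];
const total = statuses.reduce((sum, s) => sum + scoreQuestion(s), 0);
console.log(total); // 2 + (-0.25) + 0.5 = 2.25
```

If you add a "Partial" status, remember to extend the Structured Result Parser schema so the per-question sheet records it too.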

Why is my Google Sheets connection failing in this exam grading automation workflow?

Usually it’s expired Google OAuth credentials or the spreadsheet was moved to a Drive folder the connected account can’t access. Reconnect the Google Sheets credential in n8n, then confirm the exact spreadsheet and worksheet still exist. If the workflow used to work and suddenly stopped, permissions changes in Google Drive are a very common cause. Rate limits can also show up when you append lots of detailed rows at once, so batching helps.

How many exam papers can this exam grading automation handle?

A typical setup handles a full class set comfortably, and you can scale higher by batching submissions and running during off-hours.

Is this exam grading automation better than using Zapier or Make?

For AI-heavy grading, n8n is usually the better fit because you can orchestrate multi-step logic, structured parsing, and multiple writes to Google Sheets without paying per tiny step. You also have the option to self-host, which matters when you’re processing lots of exam submissions in a short window. Zapier or Make can still work for simpler “upload file, send email, add row” flows, and their setup can feel faster at first. If you’re handling detailed per-question logs, though, those platforms get messy quickly. Talk to an automation expert if you want a recommendation based on your volume.

Once this is running, grading becomes a review task instead of a clerical one. The workflow handles the repetitive logging, and you get your time back for teaching and feedback.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

