GitHub to Google Sheets, clean profiles with Slack
You find a promising GitHub profile, open ten tabs, skim repos, copy links, and then lose the thread halfway through. By the time you’re done, you still don’t have a clean, shareable record. Just scattered notes and a half-filled spreadsheet.
Technical recruiters feel this first. But talent ops teams and agency sourcers run into the same mess when they try to scale GitHub Sheets automation beyond a handful of profiles.
This workflow pulls a list of GitHub profile URLs, scrapes the details, and writes a structured report into Google Sheets with one tab per person, while Slack keeps you updated as it runs. You’ll see what breaks in the manual process, what the automation changes, and what you need to run it reliably.
How This Automation Works
See how this solves the problem:
n8n Workflow Template: GitHub to Google Sheets, clean profiles with Slack
flowchart LR
subgraph sg0["When clicking ‘Execute workflow’ Flow"]
direction LR
n0@{ icon: "mdi:play-circle", form: "rounded", label: "When clicking ‘Execute workf..", pos: "b", h: 48 }
n1@{ icon: "mdi:database", form: "rounded", label: "Get row(s) in sheet", pos: "b", h: 48 }
n2@{ icon: "mdi:swap-vertical", form: "rounded", label: "Loop Over Items", pos: "b", h: 48 }
n3@{ icon: "mdi:cog", form: "rounded", label: "Run a workflow task", pos: "b", h: 48 }
n4@{ icon: "mdi:cog", form: "rounded", label: "Get details of a workflow task", pos: "b", h: 48 }
n5["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/code.svg' width='40' height='40' /></div><br/>Code in JavaScript"]
n6@{ icon: "mdi:database", form: "rounded", label: "Create sheet", pos: "b", h: 48 }
n7["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge"]
n8@{ icon: "mdi:swap-vertical", form: "rounded", label: "Split Out", pos: "b", h: 48 }
n9@{ icon: "mdi:swap-vertical", form: "rounded", label: "Split Out1", pos: "b", h: 48 }
n10@{ icon: "mdi:swap-vertical", form: "rounded", label: "Split Out2", pos: "b", h: 48 }
n11@{ icon: "mdi:database", form: "rounded", label: "Clear sheet", pos: "b", h: 48 }
n12@{ icon: "mdi:swap-vertical", form: "rounded", label: "Edit Fields", pos: "b", h: 48 }
n13@{ icon: "mdi:database", form: "rounded", label: "Append row in sheet", pos: "b", h: 48 }
n14["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/merge.svg' width='40' height='40' /></div><br/>Merge1"]
n15["<div style='background:#f5f5f5;padding:10px;border-radius:8px;display:inline-block;border:1px solid #e0e0e0'><img src='https://flowpast.com/wp-content/uploads/n8n-workflow-icons/slack.svg' width='40' height='40' /></div><br/>Send a message"]
n16@{ icon: "mdi:database", form: "rounded", label: "User Data", pos: "b", h: 48 }
n17@{ icon: "mdi:database", form: "rounded", label: "User Links", pos: "b", h: 48 }
n18@{ icon: "mdi:database", form: "rounded", label: "User Repositories", pos: "b", h: 48 }
n7 --> n9
n7 --> n8
n7 --> n10
n14 --> n2
n8 --> n17
n16 --> n14
n9 --> n16
n10 --> n18
n17 --> n14
n11 --> n13
n12 --> n11
n6 --> n12
n2 --> n15
n2 --> n3
n18 --> n14
n5 --> n6
n5 --> n7
n13 --> n7
n1 --> n2
n3 --> n4
n4 --> n5
n0 --> n1
end
%% Styling
classDef trigger fill:#e8f5e9,stroke:#388e3c,stroke-width:2px
classDef ai fill:#e3f2fd,stroke:#1976d2,stroke-width:2px
classDef aiModel fill:#e8eaf6,stroke:#3f51b5,stroke-width:2px
classDef decision fill:#fff8e1,stroke:#f9a825,stroke-width:2px
classDef database fill:#fce4ec,stroke:#c2185b,stroke-width:2px
classDef api fill:#fff3e0,stroke:#e65100,stroke-width:2px
classDef code fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
classDef disabled stroke-dasharray: 5 5,opacity: 0.5
class n0 trigger
class n1,n6,n11,n13,n16,n17,n18 database
class n5 code
The Challenge: Reviewing GitHub Profiles Without Losing Context
GitHub is rich, but it’s messy when you’re doing review work at speed. Profiles have pinned repos, recent activity, organizations, stars, links, and a long tail of clues that matter only when you can capture them consistently. Manually, you end up with notes that don’t match from person to person. Worse, you revisit the same profile twice because you can’t tell what you already checked. It’s not just time. It’s mental overhead, and it quietly drags down your shortlisting quality.
The friction compounds. A single profile is annoying; a batch of 30 becomes a full afternoon you didn’t plan for.
- Copy-pasting repo names, links, and stats into a sheet invites small errors that later look like “bad candidates.”
- You end up with one giant worksheet that nobody wants to read, so decisions get delayed.
- Manual review makes it hard to compare people fairly, because you never capture the same fields in the same order.
- When a teammate asks “where did this profile come from?”, you’re hunting through Slack threads and browser history.
The Fix: Scrape GitHub Profiles into Structured Google Sheets
This automation starts with a simple input: a Google Sheet containing GitHub profile URLs you want to review. It processes each profile one at a time, sends a Slack message so you know what’s being handled, and uses BrowserAct to scrape the full profile page plus repository and social/link data. Then a JavaScript cleanup step consolidates the raw scrape into a tidy JSON object (honestly, this is the part that makes the output usable). Finally, the workflow creates a dedicated tab for that person inside your reporting spreadsheet, clears it for a fresh run, and writes three clean sections: profile details, links, and repositories. The result is a repeatable “profile dossier” you can share with a hiring manager or keep as an internal record.
The workflow kicks off manually today, but it’s built so you can swap the trigger later (schedule it, or feed it from a sourcing workflow). From there, it’s consistent: read URLs from Sheets, scrape with BrowserAct, clean the data, then write a structured report and notify Slack as each profile runs.
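For orientation, the consolidated object looks roughly like this. The field names come from the node mappings later in this guide; the exact shape depends on your BrowserAct template, so treat it as a sketch rather than a contract.

```javascript
// Rough shape of the cleaned-up profile object (values invented for illustration).
// "Programing Language" is spelled the way the workflow's field name spells it.
const profile = {
  Name: 'Jane Doe',
  Username: 'janedoe',
  Summary: 'Backend engineer building developer tooling',
  Location: 'Berlin, Germany',
  Links: [
    { Site: 'Personal blog', link: 'https://janedoe.dev' },
  ],
  Repositories: [
    { Title: 'repo-a', 'Programing Language': 'Go', Stars: '412', Date: '2024-01-15' },
  ],
};
```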
What Changes: Before vs. After
| What This Eliminates | Impact You’ll See |
|---|---|
| Opening ten tabs and copy-pasting repo names, links, and stats by hand | Every profile is scraped and written to Sheets in the same structure, every time |
| One giant worksheet nobody wants to read | A dedicated, shareable tab per candidate with profile details, links, and repositories |
| Notes that never capture the same fields in the same order | Fair, side-by-side comparisons because every dossier looks the same |
| Hunting through Slack threads and browser history for provenance | A master sheet of source URLs plus Slack updates as each profile runs |
Real-World Impact
Say you review 25 GitHub profiles for a role each week. Manually, you might spend about 10 minutes per profile opening repos, grabbing links, and writing a usable summary, which is roughly 4 hours of “research admin.” With this workflow, you paste the URLs into the master Google Sheet, run it, and let BrowserAct do the scraping while Slack keeps you posted. Your time drops to maybe 20 minutes of setup plus quick review of the finished tabs, so you get most of that afternoon back.
Requirements
- n8n instance (try n8n Cloud free)
- Self-hosting option if you prefer (Hostinger works well)
- Google Sheets for the input list and output reports.
- Slack to post progress notifications during the run.
- BrowserAct to scrape GitHub profiles and repositories.
- BrowserAct API key (get it from your BrowserAct dashboard).
Skill level: Intermediate. You’ll connect accounts, paste credentials, and be comfortable testing a run and tweaking a few fields.
Need help implementing this? Talk to an automation expert (free 15-minute consultation).
The Workflow Flow
Manual launch (or scheduled run) starts the process. You trigger the workflow when you’re ready to enrich a batch, typically after you’ve collected profile URLs in your candidate sheet.
Google Sheets provides the source list. The workflow reads rows from your master spreadsheet, pulling in each GitHub profile URL you want to process. If you later want a different source, you can swap this input without rewriting the whole thing.
Each profile runs through a controlled loop. n8n processes one candidate at a time (Split in Batches), posts a Slack alert that the profile is starting, and then calls BrowserAct to run the scraping task. A follow-up request fetches the task details once BrowserAct finishes.
Raw scrape data gets cleaned and organized. A JavaScript Code step consolidates what BrowserAct returns into a single clean object, then the workflow creates (and clears) a dedicated Google Sheets tab named for the user. After that, it splits the cleaned data into three streams so profile info, links, and repos land in predictable sections.
Google Sheets becomes the deliverable. The workflow appends headers, writes the profile section, writes links, writes repos, and merges the paths back together so the loop can continue. Slack notifications keep the run from feeling like a black box.
You can easily modify the source sheet format to support extra columns (role, stage, sourcer) based on your needs. See the full implementation guide below for customization options.
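As a reference point, a minimal source sheet only needs the column the scraper reads, assumed here to be named URL to match the Target_Page expression used later; the other columns are optional tracking fields you can add or drop freely.

| URL | Role | Stage | Sourcer |
|---|---|---|---|
| https://github.com/octocat | Backend Engineer | Screening | Sam |
| https://github.com/some-candidate | Platform Engineer | New | Priya |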
Step-by-Step Implementation Guide
Step 1: Configure the Manual Trigger
Start the workflow manually so you can test the end-to-end enrichment before automating.
- Add and keep Manual Launch Trigger as the starting node.
- Connect Manual Launch Trigger to Retrieve Sheet Rows to feed source URLs into the workflow.
Step 2: Connect Google Sheets
Configure your Google Sheets sources and destination tabs used across the workflow.
- Open Retrieve Sheet Rows and choose the source spreadsheet and sheet for your GitHub URLs.
- Credential Required: Connect your googleSheetsOAuth2Api credentials for Retrieve Sheet Rows.
- Open Generate Sheet Tab, set Operation to `create`, and set Title to `={{ $json.Name }}`.
- Credential Required: Connect your googleSheetsOAuth2Api credentials for Generate Sheet Tab, Wipe Sheet Tab, Append Header Row, Write Profile Data, Write Link Data, and Write Repo Data (grouped Google Sheets nodes).
Step 3: Set Up Browser Automation and Parsing
Use BrowserAct to fetch profile data and parse the response into structured JSON.
- Configure Execute Browser Task with Workflow ID set to `[YOUR_ID]` and Target_Page set to `={{ $json.URL }}`.
- Credential Required: Connect your browserActApi credentials for Execute Browser Task and Fetch Task Details.
- In Fetch Task Details, set Operation to `getTask`, Task ID to `={{ $json.id }}`, Wait For Finish to `true`, Max Wait Time to `900`, and Polling Interval to `20`.
- Keep Transform Parsed Data as-is: it reads the raw result from `$input.first().json.output.string`, repairs malformed JSON, and merges the objects into a single output (see the sketch after this list).
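If you ever need to adapt that step, this is the kind of consolidation it performs. A minimal sketch only, assuming the BrowserAct result arrives as a JSON string under output.string as referenced above; it is not the exact code shipped with the template.

```javascript
// Runs in an n8n Code node (JavaScript, "Run Once for All Items").
const raw = $input.first().json.output.string;

// Keep only the part of the string that looks like JSON (assumption: one
// object or array, possibly wrapped in extra text).
const start = Math.min(...['{', '['].map((c) => raw.indexOf(c)).filter((i) => i >= 0));
const end = Math.max(raw.lastIndexOf('}'), raw.lastIndexOf(']'));

let parsed;
try {
  parsed = JSON.parse(raw.slice(start, end + 1));
} catch (e) {
  throw new Error(`Could not parse BrowserAct output: ${e.message}`);
}

// If the scrape returns an array of partial objects, merge them into one.
const merged = Array.isArray(parsed) ? Object.assign({}, ...parsed) : parsed;

return [{ json: merged }];
```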
Step 4: Configure Data Splitting and Parallel Processing
Split the parsed profile, links, and repositories into separate branches and rejoin them for batching.
- Confirm Transform Parsed Data outputs to both Generate Sheet Tab and Combine Streams in parallel.
- Verify Combine Streams is set to Mode `chooseBranch` with Use Data Of Input set to `2`.
- Set Separate Links to split on Field To Split Out `Links`.
- Set Separate Profile to split on Field To Split Out `Name` and include `Username, Summary, Location`.
- Set Separate Repos to split on Field To Split Out `Repositories`.
- Ensure Rejoin Outputs has Number Inputs set to `3` so it waits for all three branches (an illustration of the split follows this list).
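To make the splitting concrete, here is roughly what the three Split Out branches do to one cleaned item, written as plain JavaScript for illustration rather than node configuration (field names as sketched earlier).

```javascript
// Illustrative only: the effect of Separate Profile, Separate Links, and
// Separate Repos on a single item coming out of Transform Parsed Data.
const item = $input.first().json;

// Separate Profile keeps the top-level profile fields as one item.
const profileItems = [{
  json: { Name: item.Name, Username: item.Username, Summary: item.Summary, Location: item.Location },
}];

// Separate Links and Separate Repos emit one item per array element.
const linkItems = (item.Links || []).map((l) => ({ json: l }));
const repoItems = (item.Repositories || []).map((r) => ({ json: r }));
```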
Step 5: Configure Output Destinations and Notifications
Create clean sheet tabs, write data into columns, and send a Slack notification.
- In Assign Empty Fields, confirm the empty strings are assigned for Name, Username, Location, Site, link, Title, Summary, Date, Programing Language, and Stars.
- In Wipe Sheet Tab, set Operation to `clear` and Sheet Name to `={{ $('Transform Parsed Data').item.json.Name }}`.
- In Append Header Row, set Operation to `append` and ensure the column values include `={{ $('Assign Empty Fields').item.json.Name }}` and `={{ $('Assign Empty Fields').item.json.Username }}`.
- In Write Profile Data, map Name to `={{ $json.Name }}`, Summary to `={{ $json.Summary }}`, Location to `={{ $json.Location }}`, and Username to `={{ $json.Username }}`.
- In Write Repo Data, map Programming Lang to `={{ $json["Programing Language"] }}` and Repositories_Name to `={{ $json.Title }}`.
- In Post Slack Alert, set Text to `Users Data Scrapped from Github` and select your channel.
- Credential Required: Connect your slackOAuth2Api credentials for Post Slack Alert.
Sheet tabs are named with `={{ $('Transform Parsed Data').item.json.Name }}`, so ensure the Name field is present in the parsed data or sheet creation and writes will fail. A sample of the finished tab layout follows below.
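Assuming the mappings above, a finished tab reads roughly like this (abridged): the profile row first, then one row per link, then one row per repository. Your exact columns and order depend on how you configure the write nodes.

| Name | Username | Location | Site | link | Repositories_Name | Programming Lang | Stars |
|---|---|---|---|---|---|---|---|
| Jane Doe | janedoe | Berlin | | | | | |
| | | | Blog | https://janedoe.dev | | | |
| | | | | | repo-a | Go | 412 |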
Step 6: Test and Activate Your Workflow
Run a manual test to validate browser automation, JSON parsing, and sheet outputs before activating.
- Click Execute Workflow and verify Manual Launch Trigger starts the run.
- Confirm Execute Browser Task and Fetch Task Details return completed tasks with an output string.
- Check that Generate Sheet Tab creates a new sheet tab named after the user and that Wipe Sheet Tab clears it before writing.
- Validate data appears in the correct columns from Write Profile Data, Write Link Data, and Write Repo Data.
- Verify a Slack message posts via Post Slack Alert after Iterate Records completes.
- When successful, switch the workflow to Active to run it in production as needed.
Watch Out For
- BrowserAct credentials can expire or need specific permissions. If things break, check your BrowserAct API key and workspace access in the BrowserAct dashboard first.
- If you’re using Wait nodes or external rendering, processing times vary. Bump up the wait duration if downstream nodes fail on empty responses.
- Google Sheets tab creation can fail when names collide or include odd characters. If you see errors, sanitize the tab name in the Code step and confirm the spreadsheet has edit access for your Google Sheets credential.
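For that last point, a small addition to the Code step is usually enough. A minimal sketch, assuming you want to strip characters that commonly cause trouble and cap the length; the character list and cap are guesses, so adjust them to whatever your spreadsheet actually rejects.

```javascript
// Hypothetical helper for the Code step: make a scraped Name safe to use as
// a Google Sheets tab title before Generate Sheet Tab runs.
function safeTabName(name) {
  return (name || 'Unknown')
    .replace(/[\[\]\*\?\/\\:'"]/g, ' ') // drop brackets, wildcards, slashes, quotes
    .replace(/\s+/g, ' ')               // collapse leftover whitespace
    .trim()
    .slice(0, 90);                      // stay well under typical length limits
}

// Example: safeTabName('octocat/[bot]?') === 'octocat bot'
```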
Common Questions
How long does setup take?
If your BrowserAct, Google Sheets, and Slack accounts are ready, plan on about an hour.
Can a non-developer run this?
Yes. There’s no traditional coding, but someone needs to be comfortable connecting credentials and running a few test profiles. Once it’s working, day-to-day use is basically “add URLs to the sheet and click run.”
Can I run this without paying for n8n?
Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in BrowserAct usage costs, since scraping runs are billed by BrowserAct.
Where should I host n8n?
Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.
Can I adapt this workflow to my own process?
You can keep the same GitHub Sheets automation structure and change the inputs and outputs. The easiest edits happen in “Retrieve Sheet Rows” (change what columns you read), “Transform Parsed Data” (rename or add fields you care about), and the three write steps (“Write Profile Data,” “Write Link Data,” “Write Repo Data”) to match your preferred report layout. Common tweaks include adding scoring columns, writing to one shared “Summary” tab in addition to per-user tabs, and filtering out repos below a minimum star count.
Why is the BrowserAct step failing?
Usually it’s an expired API key or the community node not being installed on your self-hosted n8n. Regenerate the BrowserAct key, confirm the workspace and template are accessible, then re-save credentials in n8n. If it fails only on larger batches, it can also be rate limiting or the scrape task taking longer than expected, so the “Fetch Task Details” call returns before results are ready.
How many profiles can it handle?
It scales to hundreds of profiles as long as your scraping provider and Google Sheets limits are respected, and you’re okay with batch runs that take a while.
Is n8n a better fit than Zapier or Make for this?
Often, yes, because this isn’t just “send data from A to B.” You’re looping over a list, waiting for an external scrape job, transforming messy output, and writing to multiple sections of a spreadsheet. n8n handles that kind of branching and data shaping more naturally, and self-hosting avoids per-task pricing when you run big batches. Zapier or Make can still work if you keep the scope small, but you’ll feel the friction once you add the clean-up logic and per-candidate tab management. Talk to an automation expert if you want a quick recommendation based on your volume and process.
You end up with one clean place to review GitHub candidates, and Slack keeps the run from disappearing into the void. Set it up once, then let the workflow do the busywork while you focus on the decisions.
Need Help Setting This Up?
Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.