January 22, 2026

LinkedIn to Slack, a clean team digest every week

Lisa Granqvist, Workflow Automation Expert

Scrolling LinkedIn “for research” sounds reasonable until you realize it quietly stole your morning. Then the best posts vanish into a mess of browser tabs, saved items, and half-remembered names.

This LinkedIn Slack digest automation hits marketing managers hardest, but founders and agency leads feel it too. You get a weekly, readable team digest in Slack, built from the exact profiles you care about, so everyone stays aligned without anyone doing the scrolling.

Below, you’ll see how the workflow pulls posts from your Google Sheets watchlist, summarizes them into tight bullets, and delivers a clean Slack update (with source links attached).

How This Automation Works

See how this solves the problem:

[Workflow diagram: LinkedIn to Slack, a clean team digest every week]

The Challenge: Turning LinkedIn Noise into Team Signal

LinkedIn is great at one thing: putting a lot of “maybe useful” content in front of you. The problem is what happens after you find something good. You paste a link into Slack with no context, or you tell yourself you’ll share it later, then it disappears into the feed. Even if someone tries to be the “insights person,” the effort becomes a weekly chore: collecting posts, summarizing them, grouping them by author, and trying not to miss anything important.

It adds up fast. Here’s where it breaks down in real teams.

  • You lose strong posts because there’s no consistent place to capture and track them.
  • Manual summaries turn into a mini writing assignment, so the “digest” slips week after week.
  • People don’t click raw links, which means the value of what you found never spreads.
  • Without a shared cadence, everyone consumes different ideas and meetings drift.

The Fix: A Weekly LinkedIn Digest Posted to Slack

This workflow turns your “we should keep up with LinkedIn” intention into a repeatable system. On a weekly schedule, it reads a list of LinkedIn profile URLs from Google Sheets, then pulls each profile’s recent posts with an Apify-powered scraper. The text is cleaned so the summaries aren’t stuffed with broken links, hashtags, or formatting junk. Next, OpenAI condenses each post into two or three very short bullets, and the whole digest is trimmed to stay lean, capped at about 500 words total. Finally, the workflow posts a neatly formatted digest to Slack and replies in a thread with the original source links, so anyone can go deeper without cluttering the main channel.

The workflow starts with a Cron schedule (by default, Sunday morning). It loops through your sheet-based watchlist in batches, summarizes everything, then pushes the finished Slack digest plus link threads to the channel you choose.

Real-World Impact

Say your team tracks 20 LinkedIn profiles and you want the last week’s best posts. Manually, it’s easy to spend about 10 minutes per profile skimming, opening posts, and grabbing links, which adds up to more than 3 hours weekly (and that’s before you summarize anything). With this workflow, you update the watchlist in Google Sheets in a couple of minutes, then the weekly run compiles and summarizes everything automatically. You wake up to a Slack digest that takes about 2 minutes to read.

Requirements

  • n8n instance (try n8n Cloud free)
  • Self-hosting option if you prefer (Hostinger works well)
  • Google Sheets to store the LinkedIn profile watchlist
  • Apify to scrape recent posts from profiles
  • Slack to deliver the weekly digest to a channel
  • OpenAI API key (get it from your OpenAI dashboard)

Skill level: Intermediate. You’ll connect accounts, paste API tokens, and adjust a few text/prompt settings safely.

Need help implementing this? Talk to an automation expert (free 15-minute consultation).

The Workflow Flow

A weekly schedule kicks it off. The Cron trigger runs on a set cadence (Sunday morning by default), so the digest arrives when your team is likely to plan the week.

Your watchlist comes from Google Sheets. The workflow reads the profile URLs you maintain in a tab, then processes them in small batches so the run stays stable even as your list grows.

Posts are fetched, cleaned, and summarized. An HTTP request launches the Apify scraper to pull recent posts (up to about 10 per profile from the last 7 days). Code steps strip clutter and normalize text, then an OpenAI chat model rewrites each post into compact bullets and trims the overall digest to stay under the word cap.
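The cleaning step can be sketched in a few lines. The exact rules live in the workflow’s Code nodes; the regexes below are illustrative assumptions about what “strip clutter and normalize text” typically means here (raw URLs, hashtags, markdown punctuation, extra whitespace):

```javascript
// Sketch of the text-cleaning step (the exact rules live in the
// workflow's Code nodes; these regexes are illustrative assumptions).
function cleanPostText(text) {
  return text
    .replace(/https?:\/\/\S+/g, '')   // drop raw URLs
    .replace(/#[\w-]+/g, '')          // drop hashtags
    .replace(/[*_`>]+/g, '')          // drop markdown punctuation
    .replace(/\s+/g, ' ')             // collapse whitespace runs
    .trim();
}
```

Keeping this in one function makes it easy to extend later, for example to drop “repost” boilerplate lines before summarization.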

Slack gets the digest, plus sources. The workflow assembles a readable Slack message, splits it into chunks if needed, posts it to your chosen channel, then replies in a thread with the original post links so the main message stays clean.

You can easily modify the schedule to run daily, or change the Slack destination to DMs instead of a channel based on your needs. See the full implementation guide below for customization options.

Step-by-Step Implementation Guide

Step 1: Configure the Cron Trigger

Set the workflow to run weekly using the scheduled trigger that kicks off profile collection.

  1. Open Weekly Schedule Trigger and keep the default schedule with triggerTimes set to the item that includes hour: 9.
  2. Confirm Weekly Schedule Trigger connects to Fetch Sheet Profiles as the first step in the flow.
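For reference, “Sunday at 9 AM” in the trigger UI corresponds to the cron expression `0 9 * * 0` (shown here as an assumption about the default; n8n configures this through the trigger settings, not a raw cron string). The same condition, as a plain check:

```javascript
// Sketch: the "Sunday 9 AM" default expressed as a plain check.
// Equivalent cron expression (assumed): 0 9 * * 0
function isWeeklyRunTime(date) {
  // getDay() === 0 → Sunday; getHours() === 9 → 9 AM local time
  return date.getDay() === 0 && date.getHours() === 9;
}
```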

Step 2: Connect Google Sheets

Load LinkedIn profile URLs from a spreadsheet to feed the batch loop.

  1. Open Fetch Sheet Profiles and set Document to __GOOGLE_SHEETS_CREDENTIAL_ID__ and Sheet to gid=0.
  2. Credential Required: Connect your googleSheetsOAuth2Api credentials in Fetch Sheet Profiles.
  3. Verify Fetch Sheet Profiles outputs to Batch Profile Loop.
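The sheet read hands the batch loop one item per row. A sketch of that shape, assuming the watchlist column is named `profileUrl` (inferred from the `{{ $json.profileUrl }}` expression used later in the scraper node):

```javascript
// Sketch of what "Fetch Sheet Profiles" hands to the batch loop:
// one item per sheet row, keyed by an assumed `profileUrl` column.
function rowsToProfileItems(rows) {
  return rows
    .map(row => ({ profileUrl: (row.profileUrl || '').trim() }))
    .filter(item => item.profileUrl.length > 0); // skip blank rows
}
```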

Step 3: Configure Batch Scraping and Digest Compilation

Process profiles in batches, scrape LinkedIn posts, and assemble a raw digest for summarization.

  1. In Batch Profile Loop, set Batch Size to 5 so profiles are processed in small groups.
  2. Open Launch Apify Scraper and set URL to https://api.apify.com/v2/acts/apimaestro~linkedin-profile-posts/run-sync-get-dataset-items?token=[CONFIGURE_YOUR_TOKEN].
  3. Set Method to POST and Body (JSON) to ={ "username": "{{ $json.profileUrl }}", "page_number": 1, "limit": 3, "maxItems": 20, "total_posts": 3, "post_type": "regular", "includePostReactions": true, "includePostComments": false, "includePostShares": true, "extendOutputFunction": "async function extendOutputFunction({ item }) { const now=Date.now(); const weekMs=7*24*60*60*1000; const ts=(item?.posted_at?.timestamp)||(item?.postedAt?.timestamp)||0; const postType=item?.post_type||item?.postType||null; if(!ts||now-ts>weekMs||postType!=='regular') return null; const pick=(o,k)=>Object.fromEntries(Object.entries(o||{}).filter(([kk])=>k.includes(kk))); const author=pick(item?.author,['first_name','last_name','headline','username','profile_url','profile_picture']); const stats=pick(item?.stats,['total_reactions','like','support','love','insight','celebrate','comments','reposts']); const media=item?.media?pick(item.media,['type','url','thumbnail','images']):null; const out={ urn:item?.urn||null, full_urn:item?.full_urn||item?.fullUrn||null, posted_at:item?.posted_at||item?.postedAt||null, text:item?.text||null, url:item?.url||null, post_type:postType, author, stats, media, username: author?.profile_url||null }; return out; }" }.
  4. Ensure Launch Apify Scraper flows into Compile Batch Digest, then Clean Markdown Text, and finally Extract Digest Date.

⚠️ Common Pitfall: Replace [CONFIGURE_YOUR_TOKEN] in Launch Apify Scraper with your real Apify token, or the request will fail.
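The long `extendOutputFunction` string in the request body above is hard to read inline. Its filtering logic, restated as a standalone function (a readable paraphrase, not the exact code you paste into the node), keeps only “regular” posts from the last 7 days:

```javascript
// Readable paraphrase of the extendOutputFunction above: keep only
// "regular" posts from the last 7 days, drop everything else.
// (The real function also trims author/stats/media down to essentials.)
function keepRecentRegularPost(item, now = Date.now()) {
  const weekMs = 7 * 24 * 60 * 60 * 1000;
  const ts = item?.posted_at?.timestamp || item?.postedAt?.timestamp || 0;
  const postType = item?.post_type || item?.postType || null;
  if (!ts || now - ts > weekMs || postType !== 'regular') return null;
  return item;
}
```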

Step 4: Set Up AI Summarization and Markdown Assembly

Summarize the cleaned digest with OpenAI and format it for Slack delivery.

  1. Open Summarize via Model and select Model gpt-5-mini.
  2. Credential Required: Connect your openAiApi credentials in Summarize via Model.
  3. Confirm the message content uses the expressions ={{ $json.date || "This Week" }} and {{ $json.text }} as shown in the node.
  4. Verify Summarize via Model outputs to Assemble Markdown Digest, which builds the final markdown and text fields.

Tip: The digest length is capped in Assemble Markdown Digest with a MAX value of 3800 to stay under Slack limits.
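That cap can be sketched as a simple truncation that breaks at a line boundary, so a bullet is never cut mid-sentence (the exact implementation lives in the Assemble Markdown Digest node; this is an assumption about how the MAX value is applied):

```javascript
// Sketch of the length cap: truncate the assembled digest at a line
// boundary so it stays under Slack's ~4,000-character message limit.
const MAX = 3800; // the value used in "Assemble Markdown Digest"
function capDigest(text, max = MAX) {
  if (text.length <= max) return text;
  const cut = text.lastIndexOf('\n', max); // break at a whole line
  return text.slice(0, cut > 0 ? cut : max) + '\n…(truncated)';
}
```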

Step 5: Configure Slack Delivery and Threaded Source Links

Split the digest into chunks for Slack, post it to a channel, and add source links in a threaded reply.

  1. In Split Digest Chunks, keep the chunking logic intact so it outputs { text, part, total } items for Slack.
  2. Open Post Slack Digest and set Text to =**LinkedIn Digest (part {{$json.part}}/{{$json.total}})**\n\n{{$json.text}}.
  3. Credential Required: Connect your slackApi credentials in Post Slack Digest and set Channel to [YOUR_ID].
  4. In Prepare Source Links, confirm const SOURCE_NODE = 'Launch Apify Scraper'; and keep the workflow set to “Run Once for All Items” in this node’s settings.
  5. Open Post Thread Links and set Text to ={{ $json["linksText"] }}, then set thread_ts to ={{ $('Post Slack Digest').item.json.message.ts }} and Channel to [YOUR_ID].
  6. Credential Required: Connect your slackApi credentials in Post Thread Links.

Tip: The flow is sequential: Split Digest Chunks → Post Slack Digest → Prepare Source Links → Post Thread Links, then the loop continues via Batch Profile Loop.
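The chunking logic in Split Digest Chunks can be sketched like this. The chunk size is an assumption (anything under Slack’s message limit works); what matters is the `{ text, part, total }` shape that Post Slack Digest expects:

```javascript
// Sketch of "Split Digest Chunks": break the digest into Slack-sized
// pieces and emit { text, part, total } for each (chunk size assumed).
function splitDigestChunks(digest, size = 3000) {
  const chunks = [];
  for (let i = 0; i < digest.length; i += size) {
    chunks.push(digest.slice(i, i + size));
  }
  return chunks.map((text, idx) => ({
    text,
    part: idx + 1,        // 1-based, matches "part {{$json.part}}/{{$json.total}}"
    total: chunks.length,
  }));
}
```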

Step 6: Test and Activate Your Workflow

Run a manual test to validate data flow, summaries, and Slack delivery before turning on the schedule.

  1. Click Execute Workflow and inspect output at Compile Batch Digest, Summarize via Model, and Assemble Markdown Digest to verify the digest content.
  2. Confirm Slack posts are created by Post Slack Digest and that Post Thread Links replies in the same thread.
  3. If the digest is empty, check Launch Apify Scraper and confirm the {{ $json.profileUrl }} field exists in your Google Sheet.
  4. Once validated, switch the workflow to Active so Weekly Schedule Trigger runs it automatically.

Watch Out For

  • Google Sheets credentials can expire or need specific permissions. If things break, check the Google connection inside n8n’s Credentials first, then confirm the sheet is still shared to the same Google account.
  • If you’re using Wait-like timing (or Apify runs take longer than usual), processing times vary. Bump up delays or batch sizes if downstream steps fail because the scraper returned late or empty.
  • Default prompts in the OpenAI summarizer are generic. Add your brand voice and “what to ignore” rules early (reposts, promos, job updates), or you’ll be editing outputs forever.

Common Questions

How quickly can I implement this LinkedIn Slack digest automation?

About 30 minutes if you already have the API keys.

Can non-technical teams implement this digest workflow?

Yes. You’ll mostly connect accounts, paste tokens, and pick the Slack channel. No coding is required unless you want deeper filtering.

Is n8n free to use for this LinkedIn Slack digest workflow?

Yes. n8n has a free self-hosted option and a free trial on n8n Cloud. Cloud plans start at $20/month for higher volume. You’ll also need to factor in OpenAI API costs (often a few cents per weekly run) and Apify usage for scraping.

Where can I host n8n to run this automation?

Two options: n8n Cloud (managed, easiest setup) or self-hosting on a VPS. For self-hosting, Hostinger VPS is affordable and handles n8n well. Self-hosting gives you unlimited executions but requires basic server management.

How do I adapt this LinkedIn Slack digest solution to my specific challenges?

You can change the schedule in the Cron trigger, adjust batch size in the profile loop, and tighten summaries by editing the prompt in the OpenAI summarization step. Common tweaks include excluding reposts or sponsored content in the cleaning/code steps, limiting to certain authors, or switching Slack delivery from a channel post to a DM.
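For example, excluding reposts and obvious promo posts is a small filter dropped into a Code node before summarization (the keyword list below is an illustrative assumption; tune it to your feed):

```javascript
// Example tweak: exclude reposts and obvious promo posts before
// summarization (keyword list is an illustrative assumption).
function excludeRepostsAndPromos(posts) {
  const promoWords = /\b(sponsored|webinar|sign up|we're hiring)\b/i;
  return posts.filter(p =>
    p.post_type === 'regular' && !promoWords.test(p.text || '')
  );
}
```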

Why is my Apify connection failing in this workflow?

Usually it’s an expired or incorrect Apify API token. Update the token in n8n, then verify the specific Apify actor you’re calling is still available and your plan allows the run volume. If the actor returns empty results, check that the LinkedIn profiles are accessible to Apify (public or otherwise reachable) and that you’re not hitting rate limits.

What’s the capacity of this LinkedIn Slack digest solution?

It scales well for most small teams because it batches profiles and caps the digest at about 500 words.

Is this LinkedIn Slack digest automation better than using Zapier or Make?

Often, yes, because this workflow includes scraping, batching, cleanup logic, and chunked Slack posting. n8n handles that kind of “real workflow” without getting weirdly expensive as you add branches and code steps. Zapier or Make can work for simpler alerting, but they’re less comfortable when you need to loop through a watchlist and manage formatting limits. Also, self-hosting n8n can remove execution caps entirely, which matters once you expand the number of tracked profiles. If you’re torn, talk to an automation expert and you’ll get a straight recommendation.

Once this is running, your team gets the signal without the scroll. Set it up, let it post every week, and keep your attention for work that actually moves things forward.

Need Help Setting This Up?

Our automation experts can build and customize this workflow for your specific needs. Free 15-minute consultation—no commitment required.

Lisa Granqvist

Workflow Automation Expert

Expert in workflow automation and no-code tools.
