I Built an AI Content Pipeline That Publishes 4 SEO-Optimized Articles Per Day — Here's the Architecture
I'm a chemical engineer who taught himself to code. Six months ago I started building Catalyst OS — a life optimization platform with 106 free calculators, 225 interactive learning modules, and a premium AI journaling tool. The problem was content. I needed hundreds of articles to drive organic traffic, and writing them manually at 2-3 hours each wasn't going to work.
So I built an automated content pipeline that generates, publishes, optimizes for SEO, pings search engines, generates social posts, and notifies me via Telegram — four times a day, zero manual intervention.
Here's exactly how it works.
The Stack
- n8n (self-hosted workflow automation) — orchestrates everything
- Claude Sonnet 4 (Anthropic API) — generates the actual content
- Supabase (PostgreSQL) — stores articles, topics, and metadata
- Next.js 15 — renders articles with SSR and structured data
- IndexNow API — pings Bing/Yandex for instant indexing
- Resend — transactional email
- Telegram Bot API — real-time notifications
Architecture Overview
```text
┌─────────────┐     ┌──────────────┐     ┌─────────────────┐
│  Schedule   │────→│  Topic Bank  │────→│  Config Merge   │
│ (4x daily)  │     │  (Supabase)  │     │   (prompts +    │
└─────────────┘     └──────────────┘     │   guardrails)   │
                                         └────────┬────────┘
                                                  │
                                         ┌────────▼────────┐
                                         │ Claude Sonnet 4 │
                                         │  (generation)   │
                                         └────────┬────────┘
                                                  │
                    ┌─────────────────────────────┤
                    │                             │
            ┌───────▼───────┐             ┌───────▼───────┐
            │  Parse + SEO  │             │   OG Image    │
            │ Field Extract │             │  Generation   │
            └───────┬───────┘             └───────────────┘
                    │
            ┌───────▼───────┐
            │   Supabase    │
            │    INSERT     │
            └───────┬───────┘
                    │
         ┌──────────┼──────────┐
         │          │          │
    ┌────▼───┐ ┌────▼───┐ ┌────▼───┐
    │IndexNow│ │ Social │ │Telegram│
    │  Ping  │ │ Posts  │ │ Notify │
    └────────┘ └────────┘ └────────┘
```
Step 1: The Topic Bank
I don't let the AI decide what to write about. I maintain a topic_bank table in Supabase with pre-planned topics:
```sql
CREATE TABLE topic_bank (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  title TEXT NOT NULL,
  dimension TEXT, -- mind, body, heart, wealth, spirit
  concept TEXT,
  hook TEXT,
  target_audience TEXT,
  content_type_slug TEXT,
  status TEXT DEFAULT 'available',
  used_at TIMESTAMPTZ
);
```
A Postgres function get_next_topic_for_generation() picks the next available topic, marks it as processing, and returns it. This prevents duplicate generation if two runs overlap.
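The real claim happens in SQL, but the logic is easy to sketch in plain JavaScript (an illustrative in-memory version, not the production function):

```javascript
// Illustrative in-memory sketch of the claim logic. In production this is a
// single Postgres function, so the status update is atomic across overlapping runs.
function claimNextTopic(topics, now = new Date()) {
  const next = topics.find((t) => t.status === 'available');
  if (!next) return null;
  next.status = 'processing'; // concurrent runs will skip this topic
  next.used_at = now.toISOString();
  return next;
}
```

In Postgres, the same effect comes from `UPDATE ... SET status = 'processing' ... RETURNING *`, optionally with `FOR UPDATE SKIP LOCKED` on the selecting subquery so concurrent runs never grab the same row.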
Step 2: The Prompt Engineering
This is where most AI content pipelines fall apart. They use a generic "write an article about X" prompt and get generic garbage back. My prompt is ~16,000 characters and includes:
Brand voice rules — no AI-isms ("delve," "unleash," "game-changer"), no filler paragraphs, every claim needs a specific number or citation.
104 calculator links organized by dimension — Claude weaves 3-8 relevant internal links naturally into each article:
```javascript
const BODY_CALCS = [
  '- [TDEE Calculator](https://catalystproject.ai/calculators/body/tdee) — daily energy expenditure',
  '- [Macro Calculator](https://catalystproject.ai/calculators/body/macros) — protein/carb/fat targets',
  // ... 27 more
];
```
Featured snippet optimization — question-format headers, numbered lists, bold definitions. These are the patterns Google pulls for position zero.
Structured output format — the prompt requires 13 labeled sections (meta description, subtitle, keywords, hook, problem statement, main content, key takeaways, action steps, success metrics, time to results, evidence level, sources, primary action). The parser extracts each one into its own database column.
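As an alternative to 13 hand-written regexes, a generic splitter on the `## ` headers does the same extraction (a sketch; the section names in the example are hypothetical):

```javascript
// Generic alternative to per-field regexes: split labeled output on its
// "## Section Name" headers and return a name-to-body map.
function parseSections(text) {
  const headers = [...text.matchAll(/^## (.+)$/gm)];
  const sections = {};
  headers.forEach((m, i) => {
    const start = m.index + m[0].length;
    const end = i + 1 < headers.length ? headers[i + 1].index : text.length;
    sections[m[1].trim()] = text.slice(start, end).trim();
  });
  return sections;
}
```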
Step 3: Content Quality Enforcement
The generation node uses Claude Sonnet 4 with temperature: 0.7. After generation, a Code node parses the response and validates:
```javascript
// Extract all 13 sections via regex
const metaMatch = text.match(/## Meta Description\n([\s\S]*?)(?=\n## )/);
const keywordsMatch = text.match(/## Keywords\n([\s\S]*?)(?=\n## )/);
// ... etc.

// Quality checks
const wordCount = mainContent.split(/\s+/).length;
const internalLinkCount = (mainContent.match(/catalystproject\.ai/g) || []).length;
const hasSpecificData = /\d+%|\d+ (study|studies|participants|patients)/.test(mainContent);

if (wordCount < 800 || internalLinkCount < 3 || !hasSpecificData) {
  throw new Error('Quality check failed');
}
```
Every article gets a featured_image_url set at birth via a dynamic OG image endpoint:
/api/og?title={title}&subtitle={dimension}
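Titles with ampersands or question marks will break a naively concatenated query string, so the URL should be built with proper encoding (a small sketch; the endpoint shape matches the one above, the helper name is mine):

```javascript
// Build the OG image URL with encoded params so punctuation in titles survives.
function buildOgImageUrl(title, dimension) {
  const params = new URLSearchParams({ title, subtitle: dimension });
  return `/api/og?${params.toString()}`;
}
```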
Step 4: SEO That Actually Works
Each article is stored with structured metadata that the Next.js page consumes:
```javascript
// Automatic on every article page:
// - OpenGraph publishedTime, modifiedTime, section, tags, authors
// - Robots: max-image-preview: large, max-snippet: -1
```
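In Next.js App Router terms, that comment block corresponds to a `generateMetadata` return value. A hedged sketch (the article row field names are assumptions):

```javascript
// Map a stored article row to the metadata object a Next.js page would return
// from generateMetadata(). Row field names here are illustrative.
function articleToMetadata(article) {
  return {
    title: article.title,
    description: article.meta_description,
    openGraph: {
      type: 'article',
      publishedTime: article.created_at,
      modifiedTime: article.updated_at,
      section: article.dimension,
      tags: article.keywords,
    },
    robots: { 'max-image-preview': 'large', 'max-snippet': -1 },
  };
}
```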
The sitemap regenerates with articles at priority 0.85. An IndexNow endpoint pings Bing, Yandex, and Seznam within seconds of publication:
```javascript
// POST /api/indexnow
const urls = body.urls.map(u => `https://catalystproject.ai${u}`);

await fetch('https://api.indexnow.org/indexnow', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ host: 'catalystproject.ai', key, urlList: urls }),
});
```
Step 5: Social Distribution
After the article is inserted, a separate workflow generates platform-specific social posts (Twitter, LinkedIn) and stores them in a content_pieces table. A scheduled LinkedIn Publisher workflow picks up unposted pieces and publishes them via the LinkedIn REST API.
Step 6: Telegram Notification
Every generated article triggers a Telegram message with title, word count, internal link count, and a direct link to the published article. I review every piece even though it's automated — quality control matters.
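Telegram's Bot API makes this step nearly trivial: one `sendMessage` call per article. A sketch of the payload builder (the article field names are assumptions):

```javascript
// Build the sendMessage payload; POST it as JSON to
// https://api.telegram.org/bot<TOKEN>/sendMessage.
function buildTelegramNotification(article, chatId) {
  const text = [
    `New article: ${article.title}`,
    `Words: ${article.word_count} | Internal links: ${article.link_count}`,
    article.url,
  ].join('\n');
  return { chat_id: chatId, text };
}
```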
Results After 3 Months
- 206 published articles across 5 dimensions
- 106 calculator pages with dimension-aware CTAs on every article
- Structured data on every page (Article, BreadcrumbList, FAQPage, ProfessionalService)
- 4 articles/day with zero manual writing
- Average article: 1,200-1,800 words with 4-6 internal links
What I'd Do Differently
Start with Google Search Console on day one. I built 200+ articles before submitting my sitemap. Those articles sat unindexed for weeks. Submit your sitemap before you have content — Google will discover new pages as they appear.
Don't trust temperature: 1.0 for production content. Higher temperatures produce more creative writing but also more hallucinated citations and inconsistent formatting. 0.7 is the sweet spot for reliable, parseable output.
Internal linking is an architectural decision, not a content decision. Embedding all 104 calculator URLs into the system prompt means every article links to relevant tools without the AI needing to "remember" them. The linking happens at the prompt level, not the content level.
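Concretely, prompt assembly can be as simple as concatenating the link lists into the system prompt (a sketch; `calcLists` holds arrays shaped like `BODY_CALCS` above, and the helper name is mine):

```javascript
// Bake the internal-link inventory into the system prompt itself, so every
// generation already "knows" which tools it may link to.
function buildSystemPrompt(voiceRules, calcLists) {
  return [
    voiceRules,
    '## Internal links (weave 3-8 in naturally):',
    ...Object.values(calcLists).flat(),
  ].join('\n');
}
```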
The Full Stack
The entire platform runs on:
- Next.js 15 + React 19 + TypeScript on Vercel
- Supabase (PostgreSQL + pgvector for RAG)
- n8n Cloud (7 workflows: content gen, social posting, lead scraping, enrichment, email drafting, pipeline snapshots, LinkedIn publishing)
- Claude API for content generation and AI chat
- Stripe for subscriptions
- Resend for email
Total monthly infrastructure cost: ~$150. No employees. One codebase.
If you're building something similar or want to see the calculators and content in action, check out catalystproject.ai. The consulting page has details on how I build these systems for other businesses.
Happy to answer questions about the architecture, prompt engineering, or n8n workflow design in the comments.