a ridge on the cerebral cortex
One brain across all your AI coding tools.
Reads your sessions. Extracts what matters.
Builds a knowledge base that compounds.
curl -fsSL https://gyrus.sh/install | bash
One command. One API key. macOS, Linux & Windows.
Run the install. Gyrus scans your sessions, extracts insights, and merges them into project wikis — automatically.
Building your knowledge base...

[1/83] claude-cowork | 2025-03-20 | local_session_a3f...
       Extracted 7 thoughts
[2/83] claude-code   | 2025-03-20 | 8f2d1a...
       Extracted 3 thoughts

── Week 1: 2025-03-17 ──
Merging 12 thoughts into 'beacon'...
✓ Updated 'beacon' v1

── Week 2: 2025-03-24 ──
Merging 6 thoughts into 'beacon'...
✓ Updated 'beacon' v2 (deeper, refined)

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Gyrus found and organized 26 projects!
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
What a project page looks like
A real-time analytics dashboard for SaaS startups. Connects to Stripe, Segment, and PostHog to surface MRR trends, churn signals, and cohort insights in one view. The competitive moat is sub-second query latency on ClickHouse — most competitors batch hourly.
2025-03-20 Initial concept from Claude Code brainstorm
2025-03-24 Architecture decision: ClickHouse + real-time streaming
2025-04-01 Stripe integration live, onboarding flow redesigned
Reads sessions from your AI coding tools automatically. Just point Gyrus at your home directory and it finds everything.
Your chosen LLM picks out strategic decisions, project context, and architectural choices — not code changes.
Each merge pass makes pages deeper, not just longer. Week-over-week, your knowledge base gets sharper.
Supported tools
Gyrus runs in the background. No new sessions = no API calls = zero cost. It only spends when there's actual new work to process.
The installer sets up a cron job (or Windows Scheduled Task) at whatever frequency you choose — every 30 minutes, hourly, every 4 hours, or daily. Each run checks for new sessions since the last one. If nothing changed, it exits immediately. No LLM calls, no cost.
Gyrus installs a /gyrus slash command into Claude Code and instruction files for Codex. Your AI tools can query the knowledge base mid-session — context flows both ways.
When there are new sessions: ~$0.01–0.04 per run • Frequency is configurable during install • Self-updates with python3 ingest.py --update
Gyrus cares about the decisions, not the diffs.
Extracted
Not extracted
git commit -m "fix typo"
Each merge pass makes your pages deeper, not just longer. Watch a project page evolve over four weeks:
Week 1
"Beacon is a SaaS analytics dashboard"
Week 2
"Beacon targets early-stage startups, using ClickHouse for real-time queries"
Week 3
"Beacon's moat is sub-second latency + Stripe-native cohort analysis"
Week 4
"Beacon needs YC demo by April 1. Ship churn predictor or cut scope."
Knowledge iterates and compounds. That's the point.
Gyrus runs on your machine. You pay your LLM provider directly. No accounts, no cloud, no middleman.
Thought extraction: ~$0.01/session (Haiku, GPT-5.4-nano, Gemini Flash)
Knowledge merging: ~$0.05/page update (Sonnet, GPT-5.4, Gemini Pro)
Typical monthly cost: $5–15
Cloud accounts needed: zero
Choose your models in ~/.gyrus/config.json: Haiku, Sonnet, Opus, GPT-5.4 series, o3, o4-mini, Gemini 3.1 series — or pass any raw model ID. Swap anytime.
Non-project knowledge — your working patterns, tool preferences, recurring decisions — goes into ~/.gyrus/me.md, a personal memory page that also compounds over time.
Gyrus stores everything as plain markdown in ~/.gyrus/. Sync however you want:
iCloud / Dropbox / Google Drive
Point ~/.gyrus/ to a synced folder
Git
cd ~/.gyrus && git init
Obsidian
Set your vault path to ~/.gyrus/
Notion
Optional adapter via --storage=notion
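Whichever sync method you pick, the folder being synced looks roughly like this (the `projects/` layout is an assumption based on the paths mentioned elsewhere on this page):

```
~/.gyrus/
├── config.json        # model choices and settings
├── me.md              # personal memory page
└── projects/
    ├── beacon.md      # one wiki page per project
    └── ...
```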
Run the installer on each machine. Same knowledge base, everywhere.
Fundamentally different approach. Mem0 and OpenMemory store individual facts as vector embeddings in a database (Qdrant, PostgreSQL). You need Docker, a running server, and API calls to retrieve memories. They're designed for building AI apps, not for humans.
Gyrus produces human-readable wiki pages that you can open in any text editor. Knowledge isn't scattered across database rows — it's structured, narrative documents that an LLM iteratively refines. Each merge pass makes pages deeper, not just longer.
TL;DR: They give you a memory API. We give you a knowledge base you can actually read.
Those are static files you write and maintain by hand, scoped to a single repo. They tell the AI how to behave in that codebase. Gyrus automatically extracts and maintains knowledge from your actual work across all your projects and tools.
AGENTS.md = instructions you write. Gyrus = knowledge that writes itself.
Gyrus currently supports Claude Code, Claude Cowork, Codex, Antigravity, and Cursor. More tools added on request — each one just needs a session reader (~30 lines of Python). Contributions welcome.
Gyrus runs entirely on your machine by default. Sessions are sent to your chosen LLM API for extraction (same as using any AI tool), but your knowledge base stays local as plain markdown files. Nothing is stored in any cloud unless you opt in — you can choose to sync via iCloud, Dropbox, Notion, or Git if you want cross-machine access.
Yes. They're markdown files in ~/.gyrus/projects/. Edit them with any text editor. Gyrus will preserve your manual edits during the next merge pass.
Three commands: remove the cron job (crontab -l | grep -v ingest.py | crontab -), remove the Claude Code skill (rm ~/.claude/commands/gyrus.md), and delete the directory (rm -rf ~/.gyrus). Back up your knowledge base first if you want to keep it.
Give your AI tools the context they've been missing.
curl -fsSL https://gyrus.sh/install | bash