Operational memory for AI coding.
Your codebase tells the what. Pulse remembers the why.
Pulse is a self-hosted knowledge base that fills itself. It watches your AI coding sessions, captures decisions, dead-ends, and patterns — then makes them searchable for you and your AI agents.
Why? Every day, developers make dozens of important decisions during AI coding sessions. Why you chose this approach over that one. What you tried that didn't work. The business constraint that drove a technical choice. All of this disappears when the chat window closes. Pulse captures it automatically.
`pulse watch`, `pulse search`, `pulse reflect` — works without VS Code.

```shell
curl -fsSL https://raw.githubusercontent.com/glieai/pulse-ai/main/docker-compose.yml -o docker-compose.yml
docker compose up -d
```
Pulse is running at http://localhost:5173 — no login required in solo mode.
Optionally copy .env.example to .env to customize settings (default values work out of the box).
```shell
npx @glie/pulse-cli init        # one-time setup: config + MCP registration
npx @glie/pulse-cli setup-mcp   # run in each project folder to enable MCP there
```
`init` creates your config (`~/.pulse/config.json`) and registers the MCP server globally. Then run `setup-mcp` in each project folder where you want MCP available — or just run `pulse watch`, which does both.
Optional: Install the Pulse AI VS Code extension for a sidebar with drafts, search, watcher controls, and CodeLens annotations. The extension reads the config created by the CLI — no extra setup needed.
Open Claude Code or Codex in any configured project. Your AI agents query the knowledge base automatically via MCP. Run pulse watch to auto-generate insights from commits and AI sessions.
```
You code with AI → Pulse watches → LLM generates insight → Draft saved locally
                                                                    ↓
                                                         You review & publish
                                                                    ↓
                                                    Searchable by you + AI agents
```
| Kind | What it captures |
|---|---|
| `decision` | Technical choices with alternatives considered |
| `dead_end` | Approaches that failed and why |
| `pattern` | Reusable knowledge and conventions |
| `context` | Background information |
| `progress` | Milestones and deliverables |
| `business` | Domain constraints that drove technical decisions |
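As an illustration, a `decision` insight might be shaped roughly like this — the field names below are assumptions for the sketch, not Pulse's actual schema:

```typescript
// Hypothetical insight record — field names are illustrative, not the real schema.
type InsightKind =
  | "decision" | "dead_end" | "pattern"
  | "context" | "progress" | "business";

interface Insight {
  kind: InsightKind;
  title: string;
  body: string;
  status: "draft" | "published"; // drafts wait for your review before publishing
  tags: string[];
}

const example: Insight = {
  kind: "decision",
  title: "Use pgvector HNSW over IVFFlat",
  body: "HNSW gave better recall at our dataset size; index build time was acceptable.",
  status: "draft",
  tags: ["database", "search"],
};
```

The draft/published split mirrors the review step in the flow above: nothing becomes searchable until you publish it.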
```
/api        — Hono + Bun (TypeScript, sub-10ms)
/web        — SvelteKit + Tailwind
/cli        — CLI tool (Bun)
/extension  — VS Code extension
/mcp        — MCP server (Model Context Protocol)
/shared     — Shared types and utils
```
Stack: PostgreSQL 17 + pgvector (HNSW) · Hono · Bun · SvelteKit · Tailwind
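For intuition, a pgvector-backed semantic search over insights could look like the query below at the SQL level. The table and column names (`insights`, `embedding`) are guesses for illustration; the real schema lives in `/api`:

```typescript
// Illustrative only — assumes an `insights` table with an `embedding` vector
// column. The `<=>` operator is pgvector's cosine distance, which the HNSW
// index accelerates.
const semanticSearchSql = `
  SELECT id, title, 1 - (embedding <=> $1::vector) AS similarity
  FROM insights
  WHERE status = 'published'
  ORDER BY embedding <=> $1::vector
  LIMIT $2
`;
```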
Copy .env.example to .env and configure:
```shell
cp .env.example .env
```
| Variable | Required | Description |
|---|---|---|
| `DATABASE_URL` | Yes | PostgreSQL connection string |
| `JWT_SECRET` | Yes | Random 32+ character string |
| `ANTHROPIC_API_KEY` | No | For AI-powered insight generation |
| `OPENAI_API_KEY` | No | Alternative LLM provider |
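A minimal `.env` might look like this — every value below is a placeholder, substitute your own:

```shell
# Example .env — placeholder values only
DATABASE_URL=postgresql://pulse:pulse@localhost:5432/pulse
JWT_SECRET=replace-with-a-random-32-plus-character-string
# Optional: set one provider key to enable AI insight generation
ANTHROPIC_API_KEY=
```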
```shell
# Install dependencies
bun install

# Start database
docker compose -f docker-compose.dev.yml up -d

# Setup environment
cp .env.example .env

# Run migrations
bun run db:migrate

# Start API (terminal 1)
cd api && bun run dev

# Start Web (terminal 2)
cd web && bun run dev
```
When connected via MCP, your AI agent has access to:
| Tool | Description |
|---|---|
| `pulse_search` | Search the knowledge base |
| `pulse_context` | Get relevant context for a topic |
| `pulse_file_context` | Get insights related to a file |
| `pulse_create` | Record a new insight |
| `pulse_publish` | Publish draft insights |
| `pulse_summary` | Get knowledge base overview |
| `pulse_generate` | Generate insight from raw data |
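Under the hood, an MCP tool invocation is a JSON-RPC `tools/call` request. A `pulse_search` call from an agent might look roughly like this — the `arguments` shape is an assumption here; consult the server in `/mcp` for the real parameter schema:

```typescript
// Hypothetical MCP tools/call request. `tools/call` is the standard MCP
// method name; the argument names ("query", "limit") are illustrative.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "pulse_search",
    arguments: { query: "why pgvector over a dedicated vector DB?", limit: 5 },
  },
};
```

Your agent issues requests like this automatically once the project is configured — you never write them by hand.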
Contributions are welcome. Please open an issue first to discuss what you'd like to change.
```shell
# Fork the repo, create a branch
git checkout -b feature/your-feature

# Make changes, run checks
bun run lint
bun run format

# Open a PR
```
Apache 2.0 — see LICENSE.
Built by Glie.