
Zenii (zen-ee-eye)

Your machine's AI brain. One 20MB binary gives every tool, script, and cron job shared AI memory + 133 API routes. Desktop app, CLI, Telegram — all connected. Rust-powered.


Zenii demo

One local AI backend for your scripts, tools, and agents.

Zenii runs a daemon on http://localhost:18981 so your desktop app, CLI, TUI, scripts, and MCP clients all use the same memory, tools, model providers, and permissions.


Zenii is for developers who want AI to behave like infrastructure instead of a browser tab. Run one local service. Call it from curl, scripts, cron jobs, or an MCP client. Use the same backend from the native desktop app, CLI, or TUI.

See It Work

curl -fsSL https://raw.githubusercontent.com/sprklai/zenii/main/install.sh | bash
zenii-daemon &

# Store something once
curl -s -X POST http://localhost:18981/memory \
  -H "Content-Type: application/json" \
  -d '{"key":"deploy","content":"Production database is on port 5434"}' >/dev/null

# Ask through chat later
curl -s -X POST http://localhost:18981/chat \
  -H "Content-Type: application/json" \
  -d '{"session_id":"ops","prompt":"What port is the production database on?"}' | jq -r '.response'

That is the core value: write state once, use it from anywhere that talks to Zenii.
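In scripts, that pattern is easy to wrap once and reuse. A minimal sketch (the /chat route and the `.response` field are taken from the example above; `jq` is assumed to be installed, and the helper does no JSON escaping of its arguments):

```shell
# zenii.sh -- tiny helpers around the Zenii HTTP API for use in scripts.

ZENII_URL="${ZENII_URL:-http://localhost:18981}"

chat_body() {
  # Build the JSON body for POST /chat: $1 = session id, $2 = prompt.
  printf '{"session_id":"%s","prompt":"%s"}' "$1" "$2"
}

zenii_ask() {
  # Ask a question through the shared backend and print the reply.
  curl -s -X POST "$ZENII_URL/chat" \
    -H "Content-Type: application/json" \
    -d "$(chat_body "$1" "$2")" | jq -r '.response'
}

# Usage from any script on the machine:
#   zenii_ask ops "What port is the production database on?"
```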

What Zenii Is

  • A local daemon with a REST and WebSocket API at localhost:18981
  • A shared AI backend for the desktop app, CLI, TUI, scripts, and MCP clients
  • Persistent memory, provider routing, and tool execution in one local service
  • A native Rust/Tauri stack instead of an Electron wrapper

Good Fit

  • Local automations that need shared memory across scripts, bots, and tools
  • Developer tooling that wants one AI backend behind HTTP or MCP
  • Self-hosted workflows where privacy and local control matter
  • Projects that want a desktop UI and a scriptable backend without maintaining both separately
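The cron case often reduces to a single POST in a nightly script. A sketch using the /memory route from "See It Work" (the key name, schedule, and script path are illustrative):

```shell
#!/bin/sh
# record-backup.sh -- run from cron, e.g.:
#   0 2 * * * /usr/local/bin/record-backup.sh
# Records the nightly backup status in Zenii's shared memory so any
# other script, bot, or chat session on this machine can ask about it.

backup_note() {
  # Build the JSON body for POST /memory, with a UTC timestamp.
  printf '{"key":"backup","content":"Backup finished at %s"}' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
}

# Ignore failures so cron stays quiet when the daemon is not running.
curl -s -X POST http://localhost:18981/memory \
  -H "Content-Type: application/json" \
  -d "$(backup_note)" >/dev/null 2>&1 || true
```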

Current Product Boundaries

  • Zenii is not a hosted SaaS product
  • Zenii is not a drop-in OpenAI-compatible server today
  • Mobile is planned, but not shipped in this repository

What Ships Today

  • zenii-daemon: local API server
  • zenii: CLI client
  • zenii-tui: terminal UI
  • zenii-desktop: Tauri desktop app
  • zenii-mcp-server: MCP server for Claude Code, Cursor, and similar clients
  • 18 tools total (15 base + 3 feature-gated: channels, scheduler, workflows)
  • 133 total API routes: 105 base routes and 28 feature-gated routes
  • LLM wiki with binary document ingestion (PDF, DOCX, PPTX, XLSX, images via MarkItDown)
  • MIT license

Install

Use the install script on Linux or macOS:

curl -fsSL https://raw.githubusercontent.com/sprklai/zenii/main/install.sh | bash
zenii-daemon &

Or download platform binaries and desktop packages from GitHub Releases.

Full platform notes, package names, and source builds are covered in the project documentation.

Interfaces

Surface            Best for
-----------------  --------------------------------------------------------
zenii-daemon       Local API server for scripts, automations, and services
zenii              Quick prompts, shell pipelines, and terminal workflows
zenii-tui          Terminal-native interactive use
zenii-desktop      Native desktop UI on top of the same backend
zenii-mcp-server   Exposing Zenii tools to external coding agents

MCP Example

Add Zenii to .mcp.json:

{
  "mcpServers": {
    "zenii": {
      "command": "zenii-mcp-server",
      "args": ["--transport", "stdio"]
    }
  }
}

More integration detail lives in AGENT.md.

Docs

Contributing

Small documentation fixes, typo fixes, tests, and focused bug fixes can go straight to a PR. Larger feature work should start with CONTRIBUTING.md.

If Zenii is useful to you, star the repo: https://github.com/sprklai/zenii

License

MIT
