Install one binary. Now your scripts have AI memory. Your cron jobs reason. Your Telegram bot thinks.
A private AI backend for everything on your machine — native desktop app, plugins in any language, and an API your curl can call. Just Rust.
https://zenii.sprklai.com
"ChatGPT is a tab you open. Zenii is a capability your machine gains."
| Your pain | How Zenii fixes it |
|---|---|
| Context resets every AI session | Semantic memory persists across sessions and survives restarts |
| AI can't do things, only talk | 16 built-in tools: web search, file ops, shell, scheduling |
| Locked into one AI provider | 18 providers, switch with one config change |
| AI tools are cloud-only | 100% local, zero telemetry, OS keyring for secrets |
| "Works on my machine" for AI | Same binary on macOS, Linux, Windows -- desktop, CLI, or daemon |
| Plugin systems require learning a framework | JSON-RPC over stdio -- write plugins in Python, Go, JS, or anything |
| AI doesn't learn your patterns | Self-evolving skills with human-in-the-loop approval |
| AI can't run tasks while you sleep | Built-in cron scheduler for autonomous recurring tasks |
| | Zenii | OpenClaw | ZeroClaw |
|---|---|---|---|
| Category | AI backend | Chat agent | Minimal daemon |
| Language | Rust | TypeScript | Rust |
| Binary | <20 MB (w/ GUI) | ~100 MB+ | ~3.4 MB |
| Desktop GUI | Native (Tauri 2) | -- | -- |
| API Routes | 96 REST+WS | Chat endpoint | Daemon endpoint |
| Plugins | Any language | JS only | Rust only |
| Memory | FTS5 + vectors | File-based | Basic |
| Self-Evolution | Human-approved | Autonomous | -- |
| Scheduling | Cron + one-shot | Cron | -- |
| Security | 6 layers default | Optional sandbox | Privacy claims |
| License | MIT | Open source | Open source |
No other project has ALL of these simultaneously: native desktop GUI, 96-route REST/WS API, plugins in any language, semantic vector memory, self-evolution with human approval, under 20 MB compiled binary, and MIT licensed.
- Agent notes (agent_notes) for agent-writable behavioral rules by category, stored in the DB and auto-injected into context
- SkillProposalTool for human-in-the-loop skill evolution
- supports_tools pre-check prevents tool-calling errors with incompatible models
- First-run onboarding via CLI (zenii setup interactive flow) and TUI (4-step overlay modal), collecting AI provider selection, API key, default model, and user profile (name, location, timezone)
- /api-docs + OpenAPI 3.1 JSON spec (feature-gated api-docs, built with utoipa)

| Layer | Technology |
|---|---|
| Language | Rust 2024 edition |
| Async | Tokio |
| AI | rig-core |
| Database | rusqlite + sqlite-vec |
| Gateway | axum (HTTP + WebSocket) |
| Frontend | Svelte 5 + SvelteKit + shadcn-svelte + Tailwind CSS |
| Desktop | Tauri 2 |
| CLI | clap |
| Plugins | JSON-RPC 2.0 external processes |
| Channels | Telegram (teloxide), Slack, Discord (serenity) -- feature-gated |
| Content | serde_yaml (YAML frontmatter parsing) |
| i18n | paraglide-js (compile-time, tree-shakeable) |
| Mobile | Tauri 2 (iOS + Android) -- future release |
| TUI | ratatui |
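Because the plugin interface in the table above is plain JSON-RPC 2.0 over stdio, a plugin can be a short script in any language. A minimal Python sketch — the `tools/list` / `tools/call` method names and the `shout` tool are illustrative assumptions, not Zenii's documented protocol:

```python
import json
import sys

def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request to a plugin-defined tool.

    Method names here are hypothetical -- consult the plugin docs
    for the real protocol surface.
    """
    method = request.get("method")
    if method == "tools/list":
        result = [{"name": "shout", "description": "Uppercase a string"}]
    elif method == "tools/call":
        params = request.get("params", {})
        result = params.get("text", "").upper()
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

def serve() -> None:
    """Run the plugin loop: one JSON-RPC request per line on stdin,
    one response per line on stdout."""
    for line in sys.stdin:
        line = line.strip()
        if line:
            print(json.dumps(handle(json.loads(line))), flush=True)
```

Calling `serve()` at the bottom of the script turns it into a long-running plugin process that the host can spawn and talk to over pipes.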
graph TD
subgraph Clients["Clients"]
Desktop[Desktop] & Mobile["Mobile<br>#40;future#41;"] & CLI[CLI] & TUI[TUI] & Daemon[Daemon]
Web["Frontend<br>Svelte 5"]
end
subgraph Core["zenii-core"]
BootEntry["boot.rs<br>init_services"]
subgraph App["Application Layer"]
Gateway["Gateway<br>axum :18981"]
AI["AI Engine<br>rig-core"]
Context["Context Engine<br>3-tier injection"]
DB["Database<br>rusqlite + sqlite-vec"]
end
subgraph Domain["Domain Layer"]
Identity["Identity<br>SoulLoader"]
Skills["Skills<br>SkillRegistry"]
UserL["User Profile<br>UserLearner"]
Channels["Channels"]
PluginReg["Plugins<br>PluginRegistry"]
end
subgraph Support["Support Layer"]
Tools["Agent Tools"]
Security["Security"]
Creds2["Credentials"]
Config["Config"]
EventBus["EventBus"]
end
end
Desktop -->|embedded gateway| Gateway
Mobile & CLI & TUI & Daemon --> Gateway
Web -->|HTTP/WS| Gateway
BootEntry --> Gateway & DB & EventBus
Gateway --> AI & DB & Context
Gateway --> Identity & Skills & UserL & Channels & PluginReg
AI --> Tools & Security & DB
AI --> Identity & Skills
Context --> DB & Identity & UserL & Skills
style Clients fill:#2196F3,color:#fff
style App fill:#4CAF50,color:#fff
style Domain fill:#FF9800,color:#fff
style Support fill:#9E9E9E,color:#fff
graph TD
desktop[zenii-desktop] --> core[zenii-core]
mobile["zenii-mobile<br>#40;future#41;"] -.-> core
cli[zenii-cli]
cli --> reqwest["reqwest<br>#40;HTTP client#41;"]
cli --> tungstenite["tokio-tungstenite<br>#40;WS#41;"]
tui[zenii-tui] --> core
daemon[zenii-daemon] --> core
core --> axum["axum<br>#40;gateway#41;"]
core --> rusqlite["rusqlite<br>#40;database#41;"]
core --> rigcore["rig-core<br>#40;AI#41;"]
core --> tokio["tokio<br>#40;async#41;"]
core --> keyring["keyring<br>#40;credentials#41;"]
core --> serdeyaml["serde_yaml<br>#40;YAML frontmatter#41;"]
core -.-> teloxide["teloxide<br>#40;Telegram, feature-gated#41;"]
core -.-> serenity["serenity<br>#40;Discord, feature-gated#41;"]
sequenceDiagram
participant U as User
participant G as Gateway (axum)
participant AI as AI Engine (rig-core)
participant M as Memory (sqlite-vec)
participant LLM as LLM Provider
participant T as Tools
U->>G: Send message (REST/WS)
G->>M: Query relevant context
M-->>G: Context results
G->>AI: Dispatch with context + tools
AI->>LLM: Stream prompt
loop Tool calling loop
LLM-->>AI: Response (may include tool calls)
alt Tool call detected
AI->>T: Execute tool
T-->>AI: Tool result
AI->>LLM: Feed result back
end
end
LLM-->>AI: Final response
AI-->>G: Stream tokens
G-->>U: Stream via WS
G->>M: Store conversation
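The tool-calling loop in the diagram above can be sketched as a small driver: keep asking the model until it stops requesting tools or a turn budget runs out. A hedged Python sketch with a pluggable model and tool table — the message shapes are illustrative, not Zenii's internal rig-core API (the default of 4 turns mirrors the `agent_max_turns` config field):

```python
def run_agent(model, tools: dict, prompt: str, max_turns: int = 4) -> str:
    """Drive the model, executing any requested tools and feeding the
    results back, until it returns a final answer or the budget runs out."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_turns):
        reply = model(messages)            # reply may include tool calls
        calls = reply.get("tool_calls", [])
        if not calls:
            return reply["content"]        # no tool calls -> final answer
        for call in calls:
            result = tools[call["name"]](**call["args"])
            messages.append({"role": "tool", "name": call["name"],
                             "content": str(result)})
    return "(turn budget exhausted)"
```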
sequenceDiagram
participant App as Application
participant Cfg as Config
participant DB as SQLite
participant Cred as Keyring
participant AI as AI Providers
participant Ctx as Context Engine
participant Plug as Plugins
participant GW as Gateway
App->>Cfg: Parse CLI args + load TOML
App->>App: Initialize tracing
App->>DB: Open/create database + migrations
App->>Cred: Initialize credential store
App->>AI: Register providers + load API keys
App->>AI: Register 14 base + 2 feature-gated agent tools
App->>Ctx: Init ContextEngine + BootContext (OS, location, timezone)
App->>Plug: Scan plugins directory + register tools/skills
App->>GW: Start axum server (:18981)
alt Desktop
App->>App: Open Tauri window
else CLI
App->>App: Enter REPL loop
else TUI
App->>App: Render ratatui UI
else Daemon
App->>App: Wait for connections
end
sequenceDiagram
participant C as Client
participant S as Server
C->>S: WS Connect /ws/chat
C->>S: { type: "chat", content: "hello" }
Note over S: Query memory + prompt + LLM
S-->>C: { type: "token", content: "Hi" }
S-->>C: { type: "token", content: " there" }
S-->>C: { type: "tool_call", name: "websearch" }
S-->>C: { type: "tool_result", result: "..." }
S-->>C: { type: "done" }
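A client consuming this stream only needs to dispatch on `type`. A minimal sketch of the client side in Python — message shapes are taken from the trace above; the WebSocket transport itself is omitted:

```python
def consume(events) -> tuple:
    """Fold a stream of chat protocol messages into the final reply text
    plus a log of tool activity."""
    text, tool_log = [], []
    for ev in events:
        kind = ev.get("type")
        if kind == "token":
            text.append(ev["content"])          # accumulate streamed tokens
        elif kind == "tool_call":
            tool_log.append(f"call {ev['name']}")
        elif kind == "tool_result":
            tool_log.append("result")
        elif kind == "done":
            break                               # server finished streaming
    return "".join(text), tool_log
```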
graph TD
Daemon[zenii-daemon] --> Default[default]
Daemon --> Ch["--features channels"]
Daemon --> Sc["--features scheduler"]
Daemon --> Wd["--features web-dashboard"]
Default --> GW["zenii-core/gateway"]
GW --> Axum[axum + tower-http]
Ch --> ChCore[zenii-core/channels]
Sc --> ScCore[zenii-core/scheduler]
Wd --> WdCore[zenii-core/web-dashboard]
WdCore --> GW
zenii/
├── Cargo.toml # Workspace root (5 members)
├── CLAUDE.md # AI assistant instructions
├── README.md # This file
├── scripts/
│ └── build.sh # Cross-platform build script
├── docs/
│ ├── architecture.md # Detailed architecture diagrams
│ ├── processes.md # Process flow diagrams
│ ├── api-reference.md # All 96 REST/WS routes
│ ├── configuration.md # All 70+ config fields
│ ├── cli-reference.md # CLI command reference
│ ├── deployment.md # Deployment guide
│ └── development.md # Development guide
├── crates/
│ ├── zenii-core/ # Shared library (NO Tauri dependency)
│ ├── zenii-desktop/ # Tauri 2.10 shell (macOS, Windows, Linux)
│ ├── zenii-mobile/ # Tauri 2 shell (iOS, Android) (future release)
│ ├── zenii-cli/ # clap CLI
│ ├── zenii-tui/ # ratatui TUI
│ └── zenii-daemon/ # Headless daemon
└── web/ # Svelte 5 SPA frontend (shared by desktop + mobile)
Linux (Debian/Ubuntu):
sudo apt install libsqlite3-dev libwebkit2gtk-4.1-dev libappindicator3-dev \
librsvg2-dev patchelf libssl-dev
macOS:
brew install sqlite3
Windows:
# SQLite is bundled via rusqlite's "bundled" feature -- no extra install needed
# Check everything compiles
cargo check --workspace
# Run tests
cargo test --workspace
# Lint
cargo clippy --workspace
# Start the daemon
cargo run -p zenii-daemon
# Start the CLI
cargo run -p zenii-cli -- chat
# Start the TUI
cargo run -p zenii-tui
# Start the desktop app (dev mode with hot reload)
cd crates/zenii-desktop && cargo tauri dev
# Start the desktop app connecting to external daemon
cd crates/zenii-desktop && ZENII_GATEWAY_URL=http://localhost:18981 cargo tauri dev
# Frontend dev server (hot reload)
cd web && bun run dev
./scripts/build.sh --target native # Debug build
./scripts/build.sh --target native --release # Release (optimized, smallest binary)
./scripts/build.sh --target native --release --crates "zenii-daemon zenii-cli" # Specific crates only
./scripts/build.sh --target native --release --all-features # With all features
Output goes to dist/native/release/.
./scripts/build.sh --tauri --release # Release bundle (.deb/.AppImage, .dmg, .msi)
./scripts/build.sh --tauri --release --bundle deb,appimage # Specific bundle formats
./scripts/build.sh --dev # Dev mode (Vite + Tauri hot reload)
./scripts/build.sh --list-targets # Show all available targets
# Linux targets
./scripts/build.sh --target linux-x86 --release --install-toolchain
./scripts/build.sh --target linux-arm64 --release --install-toolchain
./scripts/build.sh --target linux-armv7 --release --install-toolchain # Raspberry Pi
./scripts/build.sh --target linux-musl --release --install-toolchain # Static binary
# macOS (must run on macOS)
./scripts/build.sh --target macos-x86 --release # Intel
./scripts/build.sh --target macos-arm --release # Apple Silicon
./scripts/build.sh --target macos-universal --release # Universal (x86_64 + ARM via lipo)
# Windows (from Linux)
./scripts/build.sh --target windows --release --install-toolchain
# All targets at once
./scripts/build.sh --target all --release --install-toolchain
Cross-compilation prerequisites (Linux):
sudo apt install gcc-aarch64-linux-gnu # ARM64
sudo apt install gcc-arm-linux-gnueabihf # ARMv7
sudo apt install gcc-mingw-w64-x86-64 # Windows
./scripts/build.sh --target linux-arm64 --release --docker
./scripts/build.sh --target windows --release --docker
| Profile | Flag | Use Case |
|---|---|---|
| debug | (default) | Development |
| release | --release | Production (full LTO, smallest binary) |
| ci-release | --profile ci-release | CI builds (thin LTO, faster compile) |
| release-fast | --profile release-fast | Profiling (thin LTO + debug info) |
Note: Tauri desktop builds cannot cross-compile -- each platform must build on its native OS. Use the GitHub Actions CI workflow for automated multi-platform Tauri builds.
See scripts/build.sh for full options.
cargo build -p zenii-daemon # Core only (gateway + ai + keyring)
cargo build -p zenii-daemon --features local-embeddings # + local FastEmbed ONNX embeddings
cargo build -p zenii-daemon --features channels # + channel core traits + registry
cargo build -p zenii-daemon --features channels-telegram # + Telegram (teloxide)
cargo build -p zenii-daemon --features channels-slack # + Slack
cargo build -p zenii-daemon --features channels-discord # + Discord (serenity)
cargo build -p zenii-daemon --features scheduler # + cron jobs
cargo build -p zenii-daemon --features api-docs # + Scalar UI + OpenAPI spec at /api-docs
cargo build -p zenii-daemon --features web-dashboard # + embedded web UI
cargo build -p zenii-daemon --all-features # Everything
cargo test --workspace # All tests
cargo test -p zenii-core # Core only
cargo test -p zenii-core -- memory # Memory module
cargo test -p zenii-core -- db # Database module
cd web && bun run test # Frontend tests
Zenii uses a TOML configuration file. Paths are resolved via directories::ProjectDirs::from("com", "sprklai", "zenii"):
| OS | Config File | Database File |
|---|---|---|
| Linux | ~/.config/zenii/config.toml | ~/.local/share/zenii/zenii.db |
| macOS | ~/Library/Application Support/com.sprklai.zenii/config.toml | ~/Library/Application Support/com.sprklai.zenii/zenii.db |
| Windows | %APPDATA%\sprklai\zenii\config\config.toml | %APPDATA%\sprklai\zenii\data\zenii.db |
Example config.toml (flat structure, all fields optional with defaults):
gateway_host = "127.0.0.1"
gateway_port = 18981
log_level = "info"
# data_dir = "/custom/data/path" # Override default data directory
# db_path = "/custom/path/zenii.db" # Override database file path
identity_name = "Zenii"
identity_description = "AI-powered assistant"
default_provider = "anthropic"
default_model = "claude-sonnet-4-6"
security_autonomy_level = "supervised" # supervised | autonomous | strict
max_tool_retries = 3
# gateway_auth_token = "your-secret-token" # Optional bearer token for auth
# agent_max_turns = 4 # Max tool-calling turns per request
# agent_max_continuations = 1 # Max autonomous reasoning turns
# tool_dedup_enabled = true # Deduplicate identical tool calls per request
# embedding_provider = "none" # none | openai | local
# user_name = "John" # Display name for greetings
# user_timezone = "America/New_York" # IANA timezone (auto-detected on first run)
# user_location = "New York, US" # User location for context-aware queries
# plugins_dir = "/custom/plugins/path" # Override default plugins directory
# plugin_auto_update = false # Auto-update git-sourced plugins
zenii setup # First-run onboarding wizard (provider, API key, model, profile)
zenii daemon start|stop|status # Manage the daemon process
zenii chat [--session ID] [--model M] # Interactive WS streaming chat
zenii run "prompt" [--session] [--model] # Single prompt, print response
zenii memory search "query" [--limit N] [--offset N] # Search memories
zenii memory add <key> <content> # Add memory entry
zenii memory remove <key> # Remove memory entry
zenii config show # Show current config
zenii config set <key> <value> # Set a config value
zenii key set <provider> <key> # Set API key
zenii key remove <provider> # Remove API key
zenii key list # List stored keys
zenii provider list # List AI providers
zenii provider test <id> # Test provider connection
zenii provider add <id> <name> <base_url> # Add custom provider
zenii provider remove <id> # Remove user-defined provider
zenii provider default <provider> <model> # Set default model
zenii embedding activate <provider> # Activate embeddings (openai/local)
zenii embedding deactivate # Deactivate embeddings
zenii embedding status # Show embedding provider status
zenii embedding test # Test embedding generation
zenii embedding reindex # Re-embed all memories
zenii plugin list # List installed plugins
zenii plugin install <source> [--local] # Install from git URL or local path
zenii plugin remove <name> # Remove a plugin
zenii plugin update <name> # Update a plugin
zenii plugin enable <name> # Enable a plugin
zenii plugin disable <name> # Disable a plugin
zenii plugin info <name> # Show plugin details
Global options: --host, --port, --token (or ZENII_TOKEN env var)
| Group | Routes | Description |
|---|---|---|
| Health | GET /health | Health check (no auth) |
| Sessions & Chat | POST /sessions, GET /sessions, GET/PUT/DELETE /sessions/{id}, POST /sessions/{id}/generate-title, GET/POST /sessions/{id}/messages, POST /chat | Chat sessions and messaging |
| Memory | POST /memory, GET /memory, GET/PUT/DELETE /memory/{key} | Semantic memory CRUD |
| Config | GET /config, PUT /config, GET /config/file | Configuration management |
| Setup | GET /setup/status | First-run onboarding status |
| Credentials | POST/GET /credentials, DELETE /credentials/{key}, GET /credentials/{key}/value, GET /credentials/{key}/exists | Credential management (keyring) |
| Providers & Models | GET/POST /providers, GET /providers/with-key-status, GET/PUT /providers/default, GET/PUT/DELETE /providers/{id}, POST /providers/{id}/test, POST /providers/{id}/models, DELETE /providers/{id}/models/{model_id}, GET /models | Multi-provider AI management |
| Tools | GET /tools, POST /tools/{name}/execute | Tool listing and execution |
| Permissions | GET /permissions, GET /permissions/{surface}, PUT/DELETE /permissions/{surface}/{tool} | Per-surface tool permissions |
| System | GET /system/info | System information |
| Identity | GET /identity, GET/PUT /identity/{name}, POST /identity/reload | Persona management |
| Skills | GET /skills, GET/PUT/DELETE /skills/{id}, POST /skills, POST /skills/reload | Skill CRUD |
| Skill Proposals | GET /skills/proposals, POST /skills/proposals/{id}/approve, POST /skills/proposals/{id}/reject, DELETE /skills/proposals/{id} | Self-evolving skill management |
| User | GET/POST/DELETE /user/observations, GET/DELETE /user/observations/{key}, GET /user/profile | User learning + privacy |
| Embeddings | GET /embeddings/status, POST /embeddings/test, POST /embeddings/embed, POST /embeddings/download, POST /embeddings/reindex | Semantic memory embedding management |
| Plugins | GET /plugins, POST /plugins/install, GET/DELETE /plugins/{name}, PUT /plugins/{name}/toggle, POST /plugins/{name}/update, GET/PUT /plugins/{name}/config | Plugin management (install, remove, enable/disable, config) |
| Channels | POST /channels/{name}/test (always); GET /channels, GET /channels/{name}/status, POST /channels/{name}/send, POST /channels/{name}/connect, POST /channels/{name}/disconnect, GET /channels/{name}/health, POST /channels/{name}/message, GET /channels/sessions, GET /channels/sessions/{id}/messages (feature-gated) | Messaging channels |
| Scheduler | GET/POST /scheduler/jobs, PUT /scheduler/jobs/{id}/toggle, DELETE /scheduler/jobs/{id}, GET /scheduler/jobs/{id}/history, GET /scheduler/status (feature-gated) | Cron job management |
| WebSocket | GET /ws/chat, GET /ws/notifications | Streaming chat + notification push |
| API Docs | GET /api-docs, GET /api-docs/openapi.json | Interactive Scalar UI + OpenAPI 3.1 spec (feature-gated: api-docs) |
Detailed documentation lives in the docs/ directory:
See CONTRIBUTING.md for detailed guidelines. Quick summary:
- Create a feature branch: git checkout -b feature/my-feature
- Make sure cargo test --workspace and cargo clippy --workspace -- -D warnings pass

License: MIT