A multi-agent desktop application with an intelligent conversational interface.
Developed by Assistance Micro Design
Built with Claude Code by Anthropic
This software is currently in beta (v0.14.0).
Before using Zileo Chat, please be aware of the following risks:
| Risk | Description |
|---|---|
| Data Loss | Database schema may change between versions, potentially requiring data migration or reset |
| API Costs | LLM API calls to Mistral AI incur costs based on token usage; monitor your consumption |
| Instability | Features may be incomplete, contain bugs, or change without notice |
| Security | While security measures are implemented, the software has not undergone a formal security audit |
| Breaking Changes | Updates may introduce breaking changes to configurations or workflows |
Recommendation: Back up your data regularly and avoid using it for critical production tasks until the v1.0 release.
Zileo Chat is a desktop application for orchestrating AI agents through a conversational interface. It supports multi-agent workflows with tool execution, memory persistence, and human-in-the-loop validation.
Zileo Chat currently supports two LLM providers:
| Provider | Type | Link |
|---|---|---|
| Mistral AI | Cloud API | https://mistral.ai |
| Ollama | Local | https://ollama.com |
| Dependency | Purpose | Installation |
|---|---|---|
| Docker Desktop | MCP servers execution | docker.com/products/docker-desktop |
| Mistral API Key | Cloud LLM provider | console.mistral.ai |
Mistral API vs Le Chat Pro: The Le Chat subscription ($14.99/month) is for the web chat interface only. Zileo Chat requires a separate API key from La Plateforme with pay-per-token billing.
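Once you have a key from La Plateforme, a minimal setup sketch (the `MISTRAL_API_KEY` name follows the convention used by Mistral's own SDKs and is an assumption here; Zileo Chat may instead take the key through its settings UI):

```shell
# Hypothetical setup: export the key so child processes (including the app,
# if launched from this shell) can read it. The variable name is assumed.
export MISTRAL_API_KEY="your-api-key-here"

# Confirm the variable is visible before launching the app
[ -n "$MISTRAL_API_KEY" ] && echo "key set"
```

Since keys created in console.mistral.ai are billed per token, consider configuring usage limits in the console alongside the key.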
Use Docker MCP Toolkit for MCP server management:
Recommended: Always prefer Docker configurations over NPX/UVX for better isolation and zero dependency management.
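Before wiring up Docker-based MCP servers, a quick sanity check that the Docker CLI is available (a sketch; Docker Desktop puts `docker` on the PATH when installed):

```shell
# Check for the Docker CLI before configuring MCP servers;
# either branch prints a short status line.
if command -v docker >/dev/null 2>&1; then
  echo "docker cli found"
else
  echo "docker cli missing: install Docker Desktop first"
fi
```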
| Step | Command / Link |
|---|---|
| Install Ollama | ollama.com/download |
| Run cloud model | ollama run kimi-k2-thinking:cloud |
| List available models | ollama list |
Ollama Cloud: For large models like Kimi K2 (1T params), use `ollama run <model>:cloud`. No local GPU required.
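The `:cloud` suffix is what routes a model to Ollama Cloud rather than local hardware; as an illustration (this helper is not part of Zileo Chat or Ollama), a model tag can be classified purely by that suffix:

```shell
# Illustrative helper: a model tag ending in `:cloud` runs on
# Ollama Cloud; anything else runs on the local machine.
is_cloud_model() {
  case "$1" in
    *:cloud) return 0 ;;
    *)       return 1 ;;
  esac
}

is_cloud_model "kimi-k2-thinking:cloud" && echo "runs on Ollama Cloud"
is_cloud_model "llama3" || echo "runs locally"
```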
```shell
node --version   # >= 20.19
rustc --version  # >= 1.80.1
```
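The two version checks above can be scripted; a minimal sketch that compares dotted version strings with GNU `sort -V` (an illustrative helper, not part of the repository):

```shell
# version_ge A B succeeds when version A >= version B (relies on GNU `sort -V`)
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# In practice, feed it the live tool versions, e.g.:
#   version_ge "$(node --version | tr -d v)" 20.19
version_ge 20.19.0 20.19 && echo "node version ok"
version_ge 1.80.1 1.80.1 && echo "rust version ok"
```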
```shell
# Clone repository
git clone https://github.com/assistance-micro-design/zileo-chat.git
cd zileo-chat

# Install dependencies
npm install

# Development
npm run tauri:dev

# Production build
npm run tauri:build
```
| Layer | Technology |
|---|---|
| Frontend | SvelteKit 2.49 + Svelte 5 |
| Backend | Rust + Tauri 2.9 |
| Database | SurrealDB 2.4 (embedded) |
| LLM | Rig.rs 0.30 |
Full documentation is available in the `docs/` directory.
To contribute:

1. Create a feature branch (`git checkout -b feature/your-feature`)
2. Commit your changes (`git commit -m 'Add feature'`)
3. Push the branch (`git push origin feature/your-feature`)
4. Open a Pull Request

To report a vulnerability, please open a private issue on GitHub Security.
This project is licensed under the Apache License 2.0. See LICENSE for details.
Third-party licenses are documented in THIRD_PARTY_LICENSES.md.
Copyright 2025 Assistance Micro Design
Licensed under the Apache License, Version 2.0