A modern, lightweight Electron application that provides a toolbox-style AI chat interface with support for multiple AI providers (Ollama, OpenAI, Anthropic) and MCP (Model Context Protocol) tools.
- Global shortcut: `Super+Space` (or `Ctrl+Alt+Space` as a fallback) toggles the toolbox window
- Agent configurations are stored per agent in `~/.canvas/electron/agents/<agent-name>/` (Linux/macOS) or `%APPDATA%/Canvas/electron/agents/<agent-name>/` (Windows)
- Conversations are saved as `YYYY-MM-DDTHH-mm-ss-conversation.json`
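As an illustration, resolving that per-agent directory could look like the helper below; the function name is ours and the real code in `src/main/services/` may differ.

```typescript
import * as os from 'os';
import * as path from 'path';

// Illustrative only: resolves the per-agent storage directory described above.
function agentDir(agentName: string): string {
  const base =
    process.platform === 'win32'
      ? path.join(process.env.APPDATA ?? '', 'Canvas', 'electron', 'agents')
      : path.join(os.homedir(), '.canvas', 'electron', 'agents');
  return path.join(base, agentName);
}

// e.g. ~/.canvas/electron/agents/default/ on Linux/macOS
console.log(agentDir('default'));
```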
```
src/
├── main/              # Electron main process (Node.js)
│   ├── main.ts        # Application entry point
│   ├── tray.ts        # System tray management
│   ├── toolbox.ts     # Toolbox window management
│   ├── settings.ts    # Settings window management
│   └── services/      # Business logic services
├── renderer/          # Frontend (React + TypeScript)
│   ├── components/    # UI components (shadcn/ui)
│   ├── pages/         # Application pages
│   └── lib/           # Utilities and helpers
└── shared/            # Shared types and constants
    ├── types.ts       # TypeScript interfaces
    └── constants.ts   # Application constants
```
```bash
# Clone the repository
git clone <repository-url>
cd canvas

# Install dependencies
npm install

# Development mode
npm run dev

# Build for production
npm run build

# Package the application
npm run package
```
- `npm run dev` - Start both main and renderer in development mode
- `npm run dev:main` - Build and run main process only
- `npm run dev:renderer` - Start Vite dev server for renderer
- `npm run build` - Build both main and renderer for production
- `npm run build:main` - Build main process only
- `npm run build:renderer` - Build renderer only
- `npm run lint` - Run ESLint on all TypeScript files
- `npm run type-check` - Run TypeScript type checking

The application follows Electron best practices, with a clear separation between the main process, the renderer, and shared code.
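To illustrate that separation, a typical Electron setup exposes a narrow, whitelisted API to the renderer through a preload script. The sketch below is illustrative only; the channel and API names are assumptions, not taken from this repo.

```typescript
// Preload sketch: bridges main and renderer without exposing Node APIs.
import { contextBridge, ipcRenderer } from 'electron';

contextBridge.exposeInMainWorld('canvas', {
  // Renderer calls window.canvas.sendMessage(...); main handles 'chat:send'.
  sendMessage: (text: string) => ipcRenderer.invoke('chat:send', text),
});
```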
Each agent can be configured with:
- **Runtime**: `ollama`, `openai`, or `anthropic`
- **Model**: the model name (e.g. `llama3.2`, `gpt-4`, `claude-3-sonnet`)

Example configuration:

```json
{
  "name": "Default Agent",
  "systemPrompt": "You are a helpful AI assistant.",
  "runtime": "ollama",
  "apiUrl": "http://localhost:11434",
  "apiToken": "",
  "model": "llama3.2",
  "temperature": 0.7,
  "topP": 0.9,
  "maxTokens": 2048,
  "mcpTools": []
}
```
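For reference, the example above maps onto a TypeScript shape along these lines; the actual interface lives in `src/shared/types.ts` and may differ (the `mcpTools` element type in particular is an assumption).

```typescript
// Inferred from the example config above; see src/shared/types.ts for the real definition.
export interface AgentConfig {
  name: string;
  systemPrompt: string;
  runtime: 'ollama' | 'openai' | 'anthropic';
  apiUrl: string;
  apiToken: string;
  model: string;
  temperature: number;
  topP: number;
  maxTokens: number;
  mcpTools: string[]; // assumption: tool identifiers; the example only shows an empty array
}
```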
```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama3.2

# Default endpoint: http://localhost:11434
```
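Once a model is pulled, a chat request against the default endpoint looks roughly like this minimal sketch of Ollama's `/api/chat` endpoint (not code from this repo):

```typescript
// Minimal non-streaming chat call against a local Ollama instance.
const res = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3.2',
    messages: [{ role: 'user', content: 'Hello!' }],
    stream: false,
  }),
});
const data = await res.json();
console.log(data.message.content); // assistant reply
```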
```jsonc
// Configuration
{
  "runtime": "openai",
  "apiUrl": "https://api.openai.com/v1",
  "apiToken": "sk-...",
  "model": "gpt-4"
}
```
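With this configuration, the agent issues requests against OpenAI's chat completions endpoint; a rough sketch (not the app's actual service code):

```typescript
// Minimal chat completion request using the apiUrl and apiToken from the config above.
const res = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'gpt-4',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```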
```jsonc
// Configuration
{
  "runtime": "anthropic",
  "apiUrl": "https://api.anthropic.com",
  "apiToken": "sk-ant-...",
  "model": "claude-3-sonnet-20240229"
}
```
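Anthropic's Messages API differs mainly in its headers and its required `max_tokens`; a minimal sketch (again, not the app's actual code):

```typescript
// Minimal Messages API request; note the x-api-key and anthropic-version headers.
const res = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': process.env.ANTHROPIC_API_KEY ?? '',
    'anthropic-version': '2023-06-01',
  },
  body: JSON.stringify({
    model: 'claude-3-sonnet-20240229',
    max_tokens: 2048,
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});
const data = await res.json();
console.log(data.content[0].text);
```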
The application includes support for MCP tools, allowing AI agents to use external tools and services. MCP tools are configured per agent via the `mcpTools` field shown in the agent configuration above.
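For context, connecting to an MCP server from TypeScript typically goes through the official SDK. This sketch uses `@modelcontextprotocol/sdk` with an arbitrary example server and is not taken from this repo:

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn an example MCP server over stdio (server choice is arbitrary here).
const transport = new StdioClientTransport({
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '/tmp'],
});
const client = new Client({ name: 'canvas', version: '0.1.0' }, { capabilities: {} });

await client.connect(transport);
const { tools } = await client.listTools(); // tools the agent can expose to the model
const result = await client.callTool({
  name: tools[0].name,
  arguments: {}, // real arguments depend on the tool's schema
});
```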
- `Super+Space` - Toggle toolbox window
- `Ctrl+Alt+Space` - Alternative toolbox toggle
- `Enter` - Send message in chat
- `Shift+Enter` - New line in message input
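The global toggle and its fallback could be registered in the main process roughly as follows; this is an illustrative sketch, not the registration code from `src/main/`.

```typescript
import { app, globalShortcut } from 'electron';

app.whenReady().then(() => {
  const toggleToolbox = () => {
    /* show or hide the toolbox window */
  };

  // Prefer Super+Space; fall back to Ctrl+Alt+Space if registration fails.
  if (!globalShortcut.register('Super+Space', toggleToolbox)) {
    globalShortcut.register('Ctrl+Alt+Space', toggleToolbox);
  }
});

app.on('will-quit', () => globalShortcut.unregisterAll());
```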
```bash
npm run build
npm run package
```
This creates platform-specific packages in the `release/` directory.
Run with debug flags to see detailed logs:

```bash
DEBUG=canvas* npm start
```
Press `F12` in any window to open Chrome DevTools.

AGPL-3.0-or-later - see the LICENSE file for details.