Lume

A local-first AI chat desktop app for Linux.

šŸš€ Lume — Local AI Desktop Client

Lume is a high-performance, privacy-first AI desktop chat application for Linux. It provides a clean, modern interface for your local Ollama models — no cloud, no telemetry, no compromise.

Built with Tauri v2, Svelte 5, and Rust, Lume runs blazingly fast on low-end hardware while delivering a premium user experience that rivals the ChatGPT and Claude desktop apps.


✨ Features

Core

  • Zero-Latency Conversations — Connects directly to Ollama on localhost:11434 for fully offline inference
  • Real-Time Streaming — Watch tokens appear word-by-word as the CPU/GPU generates them
  • Multi-Session Chat — Create, switch between, search, and delete independent chat sessions
  • Persistent History — SQLite backend with foreign-key cascading ensures conversations survive restarts
  • Model Selection — Switch between any installed Ollama model on the fly
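Ollama streams its replies as newline-delimited JSON, with each object carrying a fragment of the response in `message.content` and a final `done` object closing the stream. The sketch below shows one way to turn such a stream buffer into tokens; `extractTokens` and the sample payload are illustrative assumptions, not Lume's actual code.

```javascript
// Hypothetical helper: split an NDJSON stream buffer from Ollama's /api/chat
// into the individual content tokens, dropping the terminating "done" object.
function extractTokens(ndjsonText) {
  return ndjsonText
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line))
    .filter((chunk) => !chunk.done)
    .map((chunk) => chunk.message.content);
}

// Example: two streamed chunks followed by the terminating "done" object.
const sample = [
  '{"message":{"content":"Hel"},"done":false}',
  '{"message":{"content":"lo"},"done":false}',
  '{"done":true}',
].join("\n");

console.log(extractTokens(sample).join("")); // "Hello"
```

Rendering each token as it arrives is what produces the word-by-word streaming effect in the chat view.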

Interface

  • Collapsible Sidebar — Premium sidebar with session list, search bar, and smooth 300ms collapse animation
  • Smart Titles — Session titles auto-generate from your first message (truncated to 35 chars)
  • Time-Relative Stamps — Sidebar shows "Just now", "2h ago", "Yesterday" etc.
  • Dark / Light Mode — Native system-preference detection with manual toggle
  • Markdown & Syntax Highlighting — Full GFM rendering with highlight.js code blocks
  • Scroll-to-Bottom FAB — Floating action button with unread badge and bounce animation
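The time-relative stamps above can be produced with a small bucketing helper along these lines; `relativeLabel` and its exact thresholds are a sketch, not Lume's implementation.

```javascript
// Hypothetical helper: map a timestamp to a sidebar label like
// "Just now", "2h ago", or "Yesterday" by bucketing the elapsed time.
function relativeLabel(timestampMs, nowMs = Date.now()) {
  const seconds = Math.floor((nowMs - timestampMs) / 1000);
  if (seconds < 60) return "Just now";
  if (seconds < 3600) return `${Math.floor(seconds / 60)}m ago`;
  if (seconds < 86400) return `${Math.floor(seconds / 3600)}h ago`;
  if (seconds < 2 * 86400) return "Yesterday";
  return new Date(timestampMs).toLocaleDateString();
}

const now = Date.now();
console.log(relativeLabel(now - 30 * 1000, now));       // "Just now"
console.log(relativeLabel(now - 2 * 3600 * 1000, now)); // "2h ago"
```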

Interaction & Details

  • Thinking Process Parsing — Natively extracts <think> reasoning metadata and renders inline as a collapsible component (for DeepSeek/Gemma reasoning models).
  • Edit & Regenerate — Modify previous prompts to branch out chats, or easily regenerate an AI output with one click.
  • Stop Generation — Abort a response mid-stream with one click, saving compute on unwanted output.
  • Copy Message — One-click text copying to your system clipboard.
  • Model Analytics — Subtly displays Generation Time (seconds) and Token Count underneath model responses.
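Reasoning models such as DeepSeek emit their chain of thought inside `<think>…</think>` tags before the final answer. Separating the two parts for the collapsible component can be done roughly as follows; `splitThinking` is a hypothetical helper, not Lume's actual parser.

```javascript
// Hypothetical helper: pull the <think>…</think> reasoning block out of a
// model response, returning the reasoning and the remaining answer separately.
function splitThinking(raw) {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  return {
    thinking: match ? match[1].trim() : null,
    answer: raw.replace(/<think>[\s\S]*?<\/think>/, "").trim(),
  };
}

const { thinking, answer } = splitThinking(
  "<think>The user wants a greeting.</think>Hello!"
);
console.log(thinking); // "The user wants a greeting."
console.log(answer);   // "Hello!"
```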

Settings & Personalization

  • Settings Panel — Dedicated modal with 5 distinct configuration tabs (Models, Chat, Appearance, Data, About)
  • Granular Control — Modify AI temperatures and assign specific models to unique chats
  • Agent Personalities — Built-in selectable personas (Coder, Writer, Pirate, Analyst, Assistant)
  • Data Portability — Export and import complete chat histories as Markdown or JSON
  • Side Panel Upgrades — Pin favorite chats to the top, toggle bulk deletion modes, and view user profiles dynamically
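A Markdown export of a session could look like the sketch below; `sessionToMarkdown` and the message shape (`role`/`content`) are illustrative assumptions rather than Lume's actual export format.

```javascript
// Hypothetical helper: serialize one chat session to a Markdown document,
// labeling each turn by its role.
function sessionToMarkdown(session) {
  const lines = [`# ${session.title}`, ""];
  for (const msg of session.messages) {
    lines.push(`**${msg.role === "user" ? "You" : "AI"}:** ${msg.content}`, "");
  }
  return lines.join("\n");
}

const md = sessionToMarkdown({
  title: "Rust questions",
  messages: [
    { role: "user", content: "What is ownership?" },
    { role: "assistant", content: "A memory-management model." },
  ],
});
console.log(md);
```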

Performance

  • ~5 MB binary size
  • < 0.5s startup time
  • < 20 MB idle RAM usage
  • No Electron — Native WebKit via Tauri

šŸ“ø Screenshots

UI overhaul complete; fresh interface screenshots coming soon.


šŸ› ļø Tech Stack

Layer       Technology
Framework   Tauri v2 (Rust backend)
Frontend    Svelte 5 with $state / $derived reactivity
Styling     Tailwind CSS v3 + @tailwindcss/typography
Database    SQLite via rusqlite
Markdown    marked + marked-highlight + DOMPurify
Syntax      highlight.js (github-dark-dimmed theme)
Inference   Ollama REST API

šŸ’» System Requirements

Resource   Minimum
OS         Fedora 40+ / any RPM-based Linux distro
CPU        Intel Core i5 or equivalent
RAM        8 GB (Lume itself uses < 20 MB)
Disk       ~10 MB for Lume + Ollama model weights
GPU        Not required (Ollama handles GPU acceleration)

šŸ“¦ Installation

Pre-built RPM (Fedora)

sudo dnf install ./lume-0.5.0-1.x86_64.rpm

From Source

Prerequisites: Node.js 18+, Rust 1.70+, and Tauri system dependencies:

sudo dnf install webkit2gtk4.1-devel openssl-devel curl wget file \
  libappindicator-gtk3-devel librsvg2-devel pango-devel

Build steps:

git clone https://github.com/sandeep4513m/Lume.git
cd Lume
npm install
npm run tauri dev      # Development mode with hot-reload
npm run tauri build    # Production RPM bundle

Note: Ensure Ollama is installed and running before launching Lume:

systemctl start ollama

šŸ—ƒļø Project Structure

lume/
ā”œā”€ā”€ src/                    # Svelte frontend
│   ā”œā”€ā”€ routes/+page.svelte # Main chat + sidebar UI
│   ā”œā”€ā”€ components/         # Markdown renderer
│   └── lib/ollama.js       # Ollama API client
ā”œā”€ā”€ src-tauri/              # Rust backend
│   ā”œā”€ā”€ src/db.rs           # SQLite schema + Tauri commands
│   └── src/lib.rs          # App bootstrap + command registration
ā”œā”€ā”€ package.json
└── README.md

šŸ¤ Contributing

Contributions are welcome! Feel free to open issues or submit pull requests.

šŸ“„ License

This project is licensed under the MIT License. See the LICENSE file for details.


Built with šŸ’š using Tauri + Svelte + Ollama
