AI Tools

🦀 A comprehensive web application for managing and testing LLM workflows with llama.cpp, vector databases, and AI agents.

Overview

AI Tools is a tester and companion web application designed to help you manage, test, and interact with local LLM infrastructure. It provides a comprehensive web UI for working with llama.cpp servers and vector databases, implementing your own personal AI agent, and running various data conversion tools.

Prerequisites

Before using AI Tools, you need to install the following dependencies separately:

  1. llama.cpp - Must be installed and available as a binary command (llama-server must be in your PATH; a quick availability check is sketched after this list)

  2. Ollama - Required for embedding model generation (used by the Vector DB features)

    • Install from: https://ollama.ai/
    • Used for generating embeddings when vectorizing documents
  3. ChromaDB - Local vector database (optional, for Vector DB features)

    • Can be run locally or connected to a remote instance
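
If you want a quick sanity check before launching the app, the minimal Rust sketch below verifies that the external binaries resolve on your PATH. It assumes both llama-server and ollama accept a --version flag, which may differ between releases:

use std::process::Command;

// Returns true if `binary` runs successfully with `--version`.
// Assumption: both llama-server and ollama accept this flag.
fn is_available(binary: &str) -> bool {
    Command::new(binary)
        .arg("--version")
        .output()
        .map(|out| out.status.success())
        .unwrap_or(false)
}

fn main() {
    for bin in ["llama-server", "ollama"] {
        let status = if is_available(bin) { "found" } else { "missing" };
        println!("{bin}: {status}");
    }
}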

Getting Started

To start developing with AI Tools, you will need:

  • rustc > 1.74
  • node > 20.9.0

Clone the project and execute:

cargo run

The interactive CLI will guide you through the installation process.

Features

1. Llama.cpp Server Management

A comprehensive web UI for managing and testing llama.cpp server instances:

  • Model Management: Scan and select from available GGUF models in your cache directory
  • Server Control: Start, stop, and monitor llama.cpp server instances (a launch sketch follows this feature list)
  • Configuration UI: Configure all llama.cpp server options through the web interface:
    • Context size
    • GPU layers (GPU splitting support)
    • Threads and batch processing
    • Memory mapping options
    • Flash attention
    • And many more advanced options
  • Real-time Logs: View server output and logs in real-time through the web interface
  • Model Selection: Choose from HuggingFace format models (e.g., user/model:quant)
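
To illustrate what the server control layer does, the sketch below spawns a llama-server process from Rust with a few of the options listed above. The model path is a placeholder, and flag names such as --ctx-size and --n-gpu-layers may vary between llama.cpp versions:

use std::process::{Child, Command};

// Spawn a llama-server instance with a handful of common options.
// The model path is a placeholder; flag names follow current llama.cpp
// builds and may differ in older releases.
fn start_llama_server(model: &str, port: u16) -> std::io::Result<Child> {
    Command::new("llama-server")
        .args(["-m", model])                 // GGUF model file
        .args(["--ctx-size", "4096"])        // context size
        .args(["--n-gpu-layers", "99"])      // offload layers to the GPU
        .args(["--port", &port.to_string()]) // HTTP port to listen on
        .spawn()                             // keep the Child handle; kill() stops it
}

fn main() -> std::io::Result<()> {
    let child = start_llama_server("/path/to/model.gguf", 8081)?;
    println!("llama-server started with PID {}", child.id());
    Ok(())
}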

2. AI Agent

A Rust-based AI agent implementation with an extensible tool system:

  • Tool System: Switch tools on/off dynamically
    • ChromaDB Tool: Vector database search integration
    • Extensible Architecture: Easy to add new tools (a trait-based sketch follows this list)
  • Conversation Management: Persistent conversation history using SQLite
  • WebSocket Support: Real-time streaming responses
  • Memory Management: SQLite-based memory system for conversation context
  • Tool Registry: Centralized tool registration and selection system
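
The project's concrete trait and registry types are not documented in this readme, but a hypothetical tool abstraction along the following lines shows how a pluggable, switchable tool system can look in Rust. All names below are illustrative, not the project's actual API:

use std::collections::HashMap;

// Hypothetical tool interface; the real project's trait may differ.
trait Tool {
    fn name(&self) -> &'static str;
    fn run(&self, input: &str) -> Result<String, String>;
}

// Stand-in for the ChromaDB search integration.
struct ChromaSearch;

impl Tool for ChromaSearch {
    fn name(&self) -> &'static str { "chromadb_search" }
    fn run(&self, input: &str) -> Result<String, String> {
        // A real implementation would query the vector database here.
        Ok(format!("results for '{input}'"))
    }
}

// Centralized registry: tools are registered once and toggled by name.
#[derive(Default)]
struct ToolRegistry {
    tools: HashMap<&'static str, Box<dyn Tool>>,
    enabled: HashMap<&'static str, bool>,
}

impl ToolRegistry {
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.enabled.insert(tool.name(), true);
        self.tools.insert(tool.name(), tool);
    }
    fn set_enabled(&mut self, name: &str, on: bool) {
        if let Some(flag) = self.enabled.get_mut(name) {
            *flag = on;
        }
    }
    fn run(&self, name: &str, input: &str) -> Option<Result<String, String>> {
        if !*self.enabled.get(name)? {
            return None;
        }
        Some(self.tools.get(name)?.run(input))
    }
}

fn main() {
    let mut registry = ToolRegistry::default();
    registry.register(Box::new(ChromaSearch));
    registry.set_enabled("chromadb_search", true);
    println!("{:?}", registry.run("chromadb_search", "actix websockets"));
}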

3. Vector Database (ChromaDB)

A local ChromaDB client with comprehensive management features:

  • Collection Management: Create, configure, and manage vector collections
  • Embedding Models:
    • Configure different embedding models via web UI
    • Uses Ollama for embedding generation (requires Ollama installation; see the request sketch after this list)
    • Support for various embedding strategies
  • Document Upload:
    • Upload and vectorize documents with automatic embedding generation
    • Support for different vectorization strategies
    • Metadata management
  • Query Interface:
    • Semantic search across collections
    • Configurable result limits and filters
    • Distance metrics (Cosine, L2, IP)
  • Configuration: Web UI for configuring collections, embedding models, and connection settings
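
For reference, embedding generation through Ollama comes down to a single HTTP call. Below is a minimal sketch using the reqwest (blocking + json features) and serde_json crates, both assumed dependencies, with nomic-embed-text as an example model name:

use serde_json::{json, Value};

// Request an embedding from a local Ollama instance on its default port.
// Assumes the embedding model has already been pulled with `ollama pull`.
fn embed(text: &str) -> Result<Vec<f64>, Box<dyn std::error::Error>> {
    let response: Value = reqwest::blocking::Client::new()
        .post("http://localhost:11434/api/embeddings")
        .json(&json!({ "model": "nomic-embed-text", "prompt": text }))
        .send()?
        .json()?;

    // The response carries the vector under the "embedding" key.
    let vector = response["embedding"]
        .as_array()
        .ok_or("missing embedding field")?
        .iter()
        .filter_map(Value::as_f64)
        .collect();
    Ok(vector)
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let vector = embed("Rust web applications with Actix")?;
    println!("embedding dimensions: {}", vector.len());
    Ok(())
}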

4. Tools Page

A collection of useful data conversion and processing tools:

  • URL to Markdown: Convert web pages to markdown format
  • HTML to Markdown: Paste HTML content and convert to markdown
  • PDF to Markdown: Upload PDF files and extract content as markdown
  • JSON to TOON: Convert JSON data to TOON format for LLM consumption
  • Text to Tokens: Count tokens in any text using the GPT-2 tokenizer
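
The token counter can be reproduced outside the web UI with the HuggingFace tokenizers crate, an assumed dependency that needs its http feature to fetch the gpt2 tokenizer definition:

use tokenizers::Tokenizer;

// Count GPT-2 tokens in a piece of text.
// Assumes the tokenizers crate with its "http" feature, which downloads
// the gpt2 tokenizer definition on first use.
fn count_tokens(text: &str) -> Result<usize, Box<dyn std::error::Error + Send + Sync>> {
    let tokenizer = Tokenizer::from_pretrained("gpt2", None)?;
    let encoding = tokenizer.encode(text, false)?;
    Ok(encoding.get_ids().len())
}

fn main() {
    match count_tokens("Convert this sentence into GPT-2 tokens.") {
        Ok(n) => println!("token count: {n}"),
        Err(e) => eprintln!("tokenizer error: {e}"),
    }
}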

Architecture

Backend (Rust + Actix)

  • API Routes: RESTful API for all features
  • WebSocket Support: Real-time communication for agent interactions and server logs
  • Service Layer: Modular service architecture
  • Tool System: Pluggable tool architecture for agent capabilities
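
The backend's concrete routes are not listed here, but the general shape of an Actix service with one REST endpoint looks like the sketch below. The /api/health path and port are illustrative only; actix-web and serde_json are assumed dependencies:

use actix_web::{web, App, HttpResponse, HttpServer, Responder};

// Illustrative handler; the project's real endpoints are not documented here.
async fn health() -> impl Responder {
    HttpResponse::Ok().json(serde_json::json!({ "status": "ok" }))
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            // Each feature area (agent, llama_server, chromadb, tools)
            // would register its own scoped routes in a similar way.
            .route("/api/health", web::get().to(health))
    })
    .bind(("127.0.0.1", 8080))?
    .run()
    .await
}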

Frontend (Astro + Svelte)

  • Component-Based: Modular Svelte components
  • TypeScript: Full type safety
  • Modern UI: Responsive and intuitive interface
  • Real-time Updates: WebSocket integration for live data

Project Structure

ai_tools
├─ src
│  ├─ backend          # Actix backend (Rust)
│  │  ├─ src
│  │  │  ├─ api
│  │  │  │  ├─ agent          # AI agent implementation
│  │  │  │  │  └─ tools       # Tool system (ChromaDB, etc.)
│  │  │  │  ├─ llama_server   # Llama.cpp server management
│  │  │  │  ├─ chromadb       # ChromaDB client and operations
│  │  │  │  └─ tools          # Data conversion tools API
│  │  │  └─ main.rs
│  │  └─ Cargo.toml
│  ├─ frontend         # Astro frontend
│  │  ├─ src
│  │  │  ├─ components
│  │  │  │  ├─ agent          # Agent chat interface
│  │  │  │  ├─ llamaServer    # Llama.cpp server UI
│  │  │  │  ├─ chromadb       # Vector DB management UI
│  │  │  │  └─ tools          # Conversion tools UI
│  │  │  ├─ pages
│  │  │  │  ├─ agent.astro    # Agent page
│  │  │  │  ├─ database.astro # Vector DB page
│  │  │  │  └─ tools.astro    # Tools page
│  │  │  └─ ...
│  │  └─ package.json
│  └─ main.rs          # CLI entry point
├─ Cargo.toml
└─ readme.md

Development

Running the Application

# Development mode (with hot reload)
cargo run

# Production build
cargo run -- --build
cargo run -- --serve

CLI Arguments

--help              # Print help message
--build             # Build production bundle
--serve             # Start production server
--test              # Run tests
--host="127.0.0.1"  # Server host address
--port=8080         # Backend port number
--env=prod/dev      # Environment mode
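
For example, a production server bound to all interfaces could be started by combining these flags (the address and port values are arbitrary examples):

cargo run -- --serve --host="0.0.0.0" --port=8080 --env=prod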

Technology Stack

Backend:

  • Rust
  • Actix Web
  • SQLite (for agent memory)
  • ChromaDB client

Frontend:

  • Astro
  • Svelte
  • TypeScript
  • Vitest (testing)

ChromaDB:

  • ChromaDB server (via npm)

External Dependencies:

  • llama.cpp (llama-server binary)
  • Ollama (for embeddings)

License

MIT
