
chatTT

Local AI chatbots, powered by Ollama

chatTT - Local Ollama AI in the browser

This project is a way to chat with local Ollama AI in the browser and organize your conversations. All the power of an LLM, with no data leaving your machine.

Requirements

Ollama - The core of chatTT. Must be installed on your OS. The default model, wizardlm2, must be installed as well (eventually, hot-swapping LLMs will be available in the app).

Ollama MUST be configured to allow https://chattt.captainbrando.com as an allowed origin for CORS. On Windows, quit the Ollama app via the docked icon, then run the following in PowerShell:

$env:OLLAMA_ORIGINS="https://chattt.captainbrando.com"
ollama serve
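
With that origin allowed, the hosted page can call Ollama's local HTTP API (by default at http://localhost:11434). As a sketch of why the CORS setting matters, here is roughly the kind of request a browser app sends to Ollama's /api/chat endpoint — the `buildChatRequest` helper and message contents are illustrative, not chatTT's actual code; the endpoint, port, and body fields follow Ollama's documented API:

```javascript
// Build a fetch()-ready request for Ollama's POST /api/chat endpoint.
// buildChatRequest is a hypothetical helper shown for illustration only.
function buildChatRequest(model, messages, baseUrl = "http://localhost:11434") {
  return {
    url: `${baseUrl}/api/chat`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama for a single JSON response
      // instead of a stream of partial chunks.
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Example: one user turn for the default model.
const req = buildChatRequest("wizardlm2", [
  { role: "user", content: "Hello!" },
]);
// In the browser the app would then call: fetch(req.url, req.options)
// That cross-origin fetch is exactly what OLLAMA_ORIGINS must permit.
```

Without the OLLAMA_ORIGINS setting, the browser blocks this request, because the page's origin (https://chattt.captainbrando.com) differs from Ollama's (http://localhost:11434).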

The server should start and allow chatTT to connect.

Getting Started

Clone the repo. Then:

npm i
npm run dev

That should open the app. If the console reports issues, make sure Ollama is installed and the default model, wizardlm2, has been downloaded:

ollama list
ollama pull wizardlm2:latest
ollama run wizardlm2
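
You can also check for the model programmatically: Ollama's GET /api/tags endpoint returns the locally installed models as JSON. A minimal sketch, assuming the documented response shape — the `hasModel` helper and the sample response below are illustrative, not part of chatTT:

```javascript
// Check whether a model name appears in the JSON returned by
// Ollama's GET /api/tags endpoint (which lists installed models).
// hasModel is a hypothetical helper shown for illustration only.
function hasModel(tagsResponse, name) {
  return tagsResponse.models.some(
    // Match either an exact name or any tag of it (e.g. "wizardlm2:latest").
    (m) => m.name === name || m.name.startsWith(name + ":")
  );
}

// Illustrative shape of an /api/tags response:
const tags = { models: [{ name: "wizardlm2:latest" }] };
hasModel(tags, "wizardlm2"); // → true
```

This mirrors what `ollama list` prints on the command line, and is the kind of check an app can use to show a "model missing" hint instead of a raw console error.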

If those commands work for you, the web app should too.
