---
title: DNG
emoji: 🤖
colorFrom: indigo
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
---
A full-stack LLM agent that responds in your voice with your expertise.
- **Backend:** FastAPI + SQLite (persistent chat history) + Cerebras (primary) + OpenRouter (fallback)
- **Frontend:** Svelte 5 (Runes) + Vite
```text
my-agent/
├── backend/              # FastAPI app (main.py, agent.py, memory.py)
│   ├── llm_clients.py    # Cerebras + OpenRouter client factories
│   └── Dockerfile        # Multi-stage build (frontend → backend)
├── frontend/             # Svelte 5 SPA
├── docker-compose.yml
└── README.md
```
| Tool | Version |
|---|---|
| Python | ≥ 3.11 |
| uv | latest |
| Node.js | 20 LTS |
```bash
cd backend
uv venv

# Windows
.venv\Scripts\activate
# macOS / Linux
source .venv/bin/activate

uv sync
cp .env.example .env   # set CEREBRAS_* (primary) and OPENROUTER_* (fallback)
uv run uvicorn main:app --reload
```
```bash
cd frontend
npm install
npm run dev
```
The Vite dev server proxies `/chat`, `/history`, and `/health` to the FastAPI backend on port 8000.
Build the frontend and run everything in one container:
```bash
cd my-agent
docker build -f backend/Dockerfile -t my-agent .
docker run -p 7860:7860 --env-file backend/.env my-agent
```
Or use Compose (includes hot-reload backend + optional Vite dev server):
```bash
docker compose up                 # production-like
docker compose --profile dev up   # + Vite dev server on :5173
```
Create `backend/.env`:
```bash
CEREBRAS_API_KEY=...                          # primary; omit to use OpenRouter only
CEREBRAS_MODEL=gpt-oss-120b                   # optional override
OPENROUTER_API_KEY=sk-or-...                  # required fallback when Cerebras fails
OPENROUTER_MODEL=anthropic/claude-3.5-sonnet  # optional override
APP_URL=https://your-hf-space-url             # optional, for HTTP-Referer header
```
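The primary/fallback behaviour these variables control can be sketched as below. `chat_with_fallback` is a hypothetical helper illustrating the pattern, not the actual code in `llm_clients.py`; the real clients would be OpenAI-compatible SDK calls against Cerebras and OpenRouter.

```python
from typing import Callable, Dict, List, Optional

Messages = List[Dict[str, str]]
ChatFn = Callable[[Messages], str]  # a configured client's chat call

def chat_with_fallback(
    primary: Optional[ChatFn],
    fallback: ChatFn,
    messages: Messages,
) -> str:
    """Try the primary (Cerebras) client first; on any error, or when no
    CEREBRAS_API_KEY was configured, fall through to OpenRouter."""
    if primary is not None:
        try:
            return primary(messages)
        except Exception:
            pass  # e.g. rate limit or outage; use the fallback instead
    return fallback(messages)
```

With `CEREBRAS_API_KEY` unset, `primary` would simply be `None` and every request goes straight to OpenRouter.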
Edit `backend/agent.py` and fill in the `SYSTEM_PROMPT`:
SYSTEM_PROMPT = """
You are [Your Name]'s AI double. You respond exactly as [Your Name] would,
using their tone, expertise in [domains], and writing style.
...
"""
1. Push the contents of `my-agent/` to the Space repo root (the `README.md` frontmatter above is already correct).
2. Set `CEREBRAS_API_KEY` (primary) and `OPENROUTER_API_KEY` (fallback) in Settings → Repository secrets.
3. The Space runs `docker build -f Dockerfile .` from the root, so either move `backend/Dockerfile` to the repo root, or add a root-level `Dockerfile` that delegates to it.

| Method | Path | Description |
|---|---|---|
| POST | /chat | SSE stream. Body: `{ session_id, message }` |
| GET | /history/{session_id} | Full message history for a session |
| GET | /health | Liveness check |
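A minimal consumer of the `/chat` stream only needs to pull the payload out of each SSE `data:` line. The parser below is a sketch; the exact framing (plain `data:` lines with a `[DONE]` sentinel) is an assumption about the backend's output, so check it against what `main.py` actually emits:

```python
from typing import Iterable, Iterator

def iter_sse_data(lines: Iterable[str]) -> Iterator[str]:
    """Yield the payload of each `data:` line from an SSE stream,
    skipping comment lines, blank keep-alives, and a [DONE] sentinel."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith(":"):
            continue  # keep-alive or SSE comment
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload == "[DONE]":
                return
            yield payload
```

In a real client you would feed it the response line iterator from, e.g., an `httpx` streaming `POST` to `http://localhost:8000/chat` with body `{"session_id": ..., "message": ...}`.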