A Svelte 5 ebook reader with AI-powered chat features, showcasing integration with the CypherTap package for nostr and eCash.
```
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│ Svelte Frontend │─────▶│ FastAPI Backend │─────▶│ LangGraph Agent │
│   (Port 5173)   │      │   (Port 8000)   │      │   (Port 2024)   │
└─────────────────┘      └─────────────────┘      └─────────────────┘
```
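The flow in the diagram is: the frontend calls the backend over HTTP, and the backend relays chat requests to the self-hosted LangGraph server via the LangGraph SDK. Below is a minimal sketch of that relay, assuming a hypothetical `/api/chat` route and the default `reader_assistant` graph ID from the configuration section; the project's actual routes and schemas live in `backend/src`.

```python
# Illustrative sketch of the backend-to-agent hop, not the project's actual code.
# The route name, request model, and response shape are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from langgraph_sdk import get_client

app = FastAPI()
# Async client for the self-hosted LangGraph server (LANGGRAPH_API_URL).
langgraph = get_client(url="http://localhost:2024")


class ChatRequest(BaseModel):
    message: str
    thread_id: str | None = None  # reuse a thread to keep conversation history


@app.post("/api/chat")  # hypothetical route
async def chat(req: ChatRequest) -> dict:
    # Create a LangGraph thread on the first message, then reuse it afterwards.
    thread_id = req.thread_id or (await langgraph.threads.create())["thread_id"]
    # Run the graph and wait for the final state.
    state = await langgraph.runs.wait(
        thread_id,
        "reader_assistant",  # LANGGRAPH_ASSISTANT_ID
        input={"messages": [{"role": "user", "content": req.message}]},
    )
    # Return the last message (the assistant's reply) plus the thread id.
    return {"thread_id": thread_id, "reply": state["messages"][-1]["content"]}
```

The frontend would then talk to an endpoint like this at the URL it reads from VITE_API_URL.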
The repository is organized as follows:

- frontend/ - Svelte 5 ebook reader application
- backend/ - FastAPI backend for business logic and LangGraph integration
- agent/ - Self-hosted LangGraph agent for AI chat
- cyphertap/ - CypherTap component library (submodule)
- docs/ - Documentation

To run the app, start each service in its own terminal, beginning with the LangGraph agent:

```bash
cd agent
pip install -e . "langgraph-cli[inmem]"
cp .env.example .env
# Add your OPENAI_API_KEY to .env
langgraph dev
```
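`langgraph dev` starts a local LangGraph API server on port 2024 and serves the graph declared in the agent's LangGraph config (typically langgraph.json). For orientation, a minimal chat graph in roughly this shape, using the prebuilt MessagesState pattern; the real reader_assistant graph in agent/ may add prompts, tools, or reader context handling.

```python
# Minimal sketch of a chat graph like reader_assistant; node names, model
# choice, and wiring are illustrative assumptions, not the actual agent code.
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, MessagesState, START, END

llm = ChatOpenAI(model="gpt-4o-mini")  # MODEL_NAME / OPENAI_API_KEY from agent/.env


def assistant(state: MessagesState) -> dict:
    # Call the model on the conversation so far and append its reply.
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(MessagesState)
builder.add_node("assistant", assistant)
builder.add_edge(START, "assistant")
builder.add_edge("assistant", END)

# `langgraph dev` serves whatever compiled graph the config file points at.
graph = builder.compile()
```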
Next, the FastAPI backend:

```bash
cd backend
pip install -e .
cp .env.example .env
uvicorn src.main:app --reload --port 8000
```
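The backend reads its LangGraph connection details from backend/.env (see Configuration below). A sketch of how those values might be consumed at startup, assuming python-dotenv is available; the variable names and defaults come from the configuration list, while the loading code itself is illustrative.

```python
# Illustrative only: how backend/.env values could be loaded at startup.
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # read backend/.env into the process environment

LANGGRAPH_API_URL = os.getenv("LANGGRAPH_API_URL", "http://localhost:2024")
LANGGRAPH_ASSISTANT_ID = os.getenv("LANGGRAPH_ASSISTANT_ID", "reader_assistant")
```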
Finally, the frontend:

```bash
cd frontend
pnpm install
cp .env.example .env
pnpm dev
```
Open http://localhost:5173 to use the app.
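With all three services running, a quick way to confirm each one is listening on its default port is to request each base URL; any HTTP response, even a 404, means the process is up. This sketch uses httpx, which you may need to install separately.

```python
# Sanity-check the three local services; ports are the defaults shown above.
import httpx

services = {
    "frontend": "http://localhost:5173",
    "backend": "http://localhost:8000",
    "agent": "http://localhost:2024",
}

for name, url in services.items():
    try:
        status = httpx.get(url, timeout=2.0).status_code
        print(f"{name:<8} {url} -> HTTP {status}")
    except httpx.HTTPError as exc:
        print(f"{name:<8} {url} -> not reachable ({exc!r})")
```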
The AI chat feature allows users to:
The AI assistant receives context about:
Agent (agent/.env):

- OPENAI_API_KEY - OpenAI API key (required)
- MODEL_PROVIDER - openai or anthropic
- MODEL_NAME - Model to use (e.g., gpt-4o-mini)

Backend (backend/.env):

- LANGGRAPH_API_URL - LangGraph server URL (default: http://localhost:2024)
- LANGGRAPH_ASSISTANT_ID - Graph ID (default: reader_assistant)

Frontend (frontend/.env):

- VITE_API_URL - Backend API URL (default: http://localhost:8000)

See individual README files in each directory for detailed development instructions:
```bash
# /frontend
git submodule update --init --recursive
pnpm dev
```
```bash
# /agent
pip install -e . "langgraph-cli[inmem]"
langgraph dev --no-browser
```
```bash
# /backend
pip install -e .
uvicorn src.main:app --reload --port 8000
```