Self-hosted, local-first CV and cover letter generator powered by AI. Your data stays on your machine — no cloud, no account, no subscription.
⚠️ Self-hosted only. ApplyKit is designed to run locally or on your own server. It has no authentication — do not expose it to the public internet without adding your own auth layer.
```
Browser (localhost:5173)
        │
        ▼
SvelteKit Frontend
  - Profile editor (multi-profile)
  - CV preview & export
  - Cover letter editor
  - Fit analysis
  - Job tracker (Kanban)
  - History browser
        │ HTTP (REST API)
        ▼
FastAPI Backend (localhost:8000)
  - Profile CRUD (SQLite, multi-profile)
  - CV import (PDF/DOCX/text → LLM extraction)
  - ATS CV enhancement (LLM)
  - Cover letter generation (LLM)
  - Fit score analysis (LLM)
  - Job URL scraping (Jina + Crawl4AI)
  - PDF export (WeasyPrint)
  - Generation history
        │
        ▼
LiteLLM → any LLM provider (Gemini, OpenAI, Anthropic, Ollama...)
```
| Layer | Technology |
|---|---|
| Frontend | SvelteKit 2 + Svelte 5 (runes) + TypeScript |
| Styling | Tailwind CSS v4 + shadcn-svelte |
| Backend | FastAPI + Python 3.12 |
| Database | SQLite via SQLAlchemy 2.0 + Alembic |
| AI | LiteLLM (configured via UI Settings) |
| PDF export | WeasyPrint (server-side) + browser print (client-side) |
| CV parsing | pdfplumber (PDF), python-docx (DOCX) |
| Job scraping | Jina Reader (primary) + Crawl4AI (fallback) + LLM parsing |
| Package managers | uv (Python), Bun (JavaScript) |
# Debian/Ubuntu (WeasyPrint system dependencies)
apt-get install libcairo2 libpango-1.0-0 libgdk-pixbuf2.0-0 libffi7 shared-mime-info

# macOS
brew install cairo pango gdk-pixbuf

git clone https://github.com/wihlarkop/applykit.git
cd applykit
docker compose up --build
Open http://localhost:3000 — your data is stored in a Docker volume and persists across restarts.
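If you want to see where that data lives on the host, Docker can tell you. This assumes the default compose project name, which yields the volume name `applykit_applykit-data`; adjust if you renamed the project:

```shell
# Print the host path backing the data volume
docker volume inspect applykit_applykit-data --format '{{ .Mountpoint }}'
```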
# 1. Clone
git clone https://github.com/wihlarkop/applykit.git
cd applykit
# 2. Configure
cp backend/.env.example backend/.env
# 3. Install dependencies
make install
# 4. Run database migrations
make migrate
# 5. Start (two separate terminals)
make backend # http://localhost:8000
make frontend # http://localhost:5173
Open http://localhost:5173 — you'll be guided through setup on first launch.
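To confirm the backend is up independently of the UI, you can fetch the OpenAPI schema that FastAPI serves by default at `/openapi.json`:

```shell
# Prints the start of the JSON schema if the backend is running
curl -s http://localhost:8000/openapi.json | head -c 200
```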
docker compose up --build
The stack is now available at:

| Service | Address |
|---|---|
| Frontend | http://localhost:3000 |
| Backend API | http://localhost:8000 |

Data lives in the Docker volume `applykit_applykit-data`.

If you're running on a server instead of localhost, set VITE_API_BASE_URL to your domain before building:
VITE_API_BASE_URL=https://api.yourdomain.com/api docker compose up --build
Or edit the args block in docker-compose.yml:
args:
VITE_API_BASE_URL: https://api.yourdomain.com/api
VITE_API_BASE_URL is baked into the frontend at build time — the browser uses it to reach the backend. It must be publicly reachable from the user's machine.
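Because the value is compiled into the frontend bundle, changing it later means rebuilding the frontend image rather than just restarting the container. A sketch, assuming the compose service is named `frontend`:

```shell
# Rebuild the frontend with the new URL, then restart the stack
VITE_API_BASE_URL=https://api.yourdomain.com/api docker compose build frontend
docker compose up -d
```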
SQLite is stored in a Docker volume. To export it:
docker run --rm \
-v applykit_applykit-data:/data \
-v $(pwd):/backup \
alpine tar czf /backup/applykit-backup.tar.gz /data
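Restoring is the mirror image: stop the stack, unpack the archive back into the volume, and start again. Alpine's tar strips the leading slash when archiving, so extracting with `-C /` puts `/data` back in place:

```shell
docker compose down
docker run --rm \
  -v applykit_applykit-data:/data \
  -v $(pwd):/backup \
  alpine tar xzf /backup/applykit-backup.tar.gz -C /
docker compose up -d
```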
docker compose up --build # Build and start
docker compose up -d # Start in background
docker compose down # Stop
docker compose logs -f backend # Follow backend logs
docker compose exec backend uv run alembic upgrade head # Run migrations manually
git clone https://github.com/your-username/applykit.git
cd applykit
cd backend
cp .env.example .env
uv sync
uv run alembic upgrade head
uv run main.py
# API: http://localhost:8000
# Swagger UI: http://localhost:8000/docs
cd ../frontend
bun install
bun run dev
# Frontend: http://localhost:5173
make install # Install all dependencies (backend + frontend)
make migrate # Run database migrations
make backend # Start backend server (http://localhost:8000)
make frontend # Start frontend dev server (http://localhost:5173)
make lint # Lint frontend TypeScript/Svelte
make migrate-new MSG="description" # Create a new migration
make migrate-down # Roll back the last migration
make help # Show all commands
Edit backend/.env to change the database path:
DATABASE_URL=sqlite:///./applykit.db
SQLite is the default and requires no additional setup. PostgreSQL is on the roadmap.
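Note the SQLAlchemy URL syntax: three slashes after `sqlite:` mean a path relative to the working directory, four mean an absolute path. For example, to keep the database outside the repo (the path here is illustrative):

```shell
# backend/.env — four slashes for an absolute path
DATABASE_URL=sqlite:////var/lib/applykit/applykit.db
```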
LLM configuration (provider, API key, model) is managed via the Settings page in the UI — no need to edit .env manually.
Click the Settings icon (gear) in the top navigation to connect a provider. You can connect multiple providers and switch between them at any time.
No API key? The app still works. CV generation falls back to your raw profile data without AI enhancement. Import, cover letter generation, and fit analysis require an LLM to be configured.
On first launch you'll be guided through setup. Fill in:
Or use AI Sync (sparkle button on the profile page) to upload an existing CV and auto-fill everything instantly.
Paste a job description in the Cover Letter page and click Analyze Fit to see:
Go to Tracker to add jobs, drag cards between stages, and link generated CVs and cover letters to each application.
Go to History to see every generated CV and cover letter. Filter by profile, search, sort by match score, and preview or re-download any entry.
| Platform | Status |
|---|---|
| Greenhouse | ✅ Supported |
| Lever | ✅ Supported |
| Ashby | ✅ Supported |
| JazzHR | 📋 Planned |
| BambooHR | 📋 Planned |
| Workday | 📋 Planned (requires browser automation) |
For boards without direct API support, Smart Apply uses Jina to scrape the page and an LLM to extract structured fields. This works on most job sites.
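Jina Reader works by prefixing the target URL with its public endpoint and returns the page as LLM-friendly markdown, so you can preview roughly what the scraper sees (the job URL here is illustrative):

```shell
# Fetch a job posting as markdown via the public Jina Reader endpoint
curl -s "https://r.jina.ai/https://example.com/careers/senior-engineer"
```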
ApplyKit has no built-in authentication. It is designed to run:
- On localhost for personal use (default)
- On your own server, behind your own auth layer

Do not expose ApplyKit to the public internet without putting an auth proxy (e.g. Authelia, Nginx basic auth, Cloudflare Access) in front of it.
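If you opt for Nginx basic auth, creating the credentials file is a single command — `htpasswd` ships with `apache2-utils` on Debian/Ubuntu (path and username below are illustrative):

```shell
# Create a password file with one user; Nginx references it via auth_basic_user_file
htpasswd -c /etc/nginx/.htpasswd youruser
```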
All LLM API keys are stored in your local SQLite database and never leave your machine.
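You can verify this yourself: the database is an ordinary SQLite file you can open with the `sqlite3` CLI. The path below assumes the default DATABASE_URL with the backend started from its own directory:

```shell
# List the tables in the local database — nothing here ever leaves your machine
sqlite3 backend/applykit.db ".tables"
```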
Items marked ✅ are shipped. Items marked 📋 are planned.
| Status | Feature |
|---|---|
| ✅ | Multi-profile support |
| ✅ | ATS CV generation with job description tailoring |
| ✅ | AI cover letter generation |
| ✅ | CV import from PDF/DOCX |
| ✅ | Fit score analysis (match %, strengths, gaps, red flags) |
| ✅ | Job URL scraper (Greenhouse, Lever, Ashby + generic) |
| ✅ | Smart Apply (URL → CV + CL in one flow) |
| ✅ | Job application tracker (Kanban) |
| ✅ | Generation history with search and filters |
| ✅ | PDF export (WeasyPrint server-side + browser print) |
| ✅ | LLM usage log with token/cost tracking |
| ✅ | Docker Compose — one-command install |
| 📋 | Docker: nginx reverse proxy so frontend + backend share port 80 |
| 📋 | Docker: multi-arch builds (arm64 for Apple Silicon / Raspberry Pi) |
| 📋 | Docker: pre-built images on GitHub Container Registry (ghcr.io) |
| 📋 | PostgreSQL support |
| ✅ | Real-time CV split-screen preview |
| 📋 | One-click portfolio site generator |
| Status | Provider |
|---|---|
| ✅ | Gemini (Google AI Studio) |
| ✅ | OpenAI |
| ✅ | Anthropic |
| ✅ | Ollama (local, offline) |
| ✅ | Any LiteLLM-compatible provider |
| ✅ | Connect / disconnect providers from UI |
| ✅ | Switch active provider with confirmation |
| Status | Feature |
|---|---|
| ✅ | Dark mode |
| ✅ | Mobile-responsive layout |
| ✅ | Toast notifications |
| ✅ | Skeleton loading states |
| ✅ | Onboarding flow |
| ✅ | Profile color + icon picker |
| ✅ | Confirm before overwriting profile via import |
| ✅ | Warn when profile switch clears in-progress cover letter |
| ✅ | Inline delete confirmation (no accidental deletions) |
| ✅ | Tracker error state and no-results message |
| ✅ | Fit analysis retry button |
| ✅ | Red flags visually distinct from cons |
| ✅ | Profile badge on CV history cards |
| ✅ | Usage table mobile responsive |
| 📋 | Keyboard shortcuts |
| 📋 | Bulk export history |
| Feature | Description |
|---|---|
| LinkedIn optimizer | AI-generated headlines and About sections |
| Multi-language CV | One-click translation of the full profile |
| Outreach generator | LinkedIn cold messages, recruiter emails, follow-ups |
| Interview coach | Practice elevator pitch with voice feedback (Web Speech API) |
| Browser extension | One-click Smart Apply from any job board |
Contributions are welcome! Here's how to get started:
git checkout -b feat/your-feature
bun install
bun x lefthook install
This wires up automatic linting and formatting on every git commit. Requires uv to be installed.

MIT — see LICENSE