LLM-powered browser built with Electron and Svelte 5, featuring tab management and content extraction for AI queries.
This project isn't intended to be a general-purpose browser with an AI feature added on. Its focus is to streamline workflows where browsing, gathering information, and querying LLMs all happen together.
Each tab can be selected as part of a query. This works for webpages, PDFs, file previews, notes, and LLM responses. Instead of copying content manually, you select the tabs you need and their extracted content is included in the prompt.
This lets you build a working set of pages, documents, data sources, and model outputs that evolves with your task.
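To make this concrete, here is a minimal TypeScript sketch of how selected tabs could be turned into query context. The type and function names (`Tab`, `TabKind`, `buildQueryContext`) are illustrative assumptions, not the project's actual API.

```ts
// Illustrative types only; the real tab manager and extraction service
// (src/main/tab-manager.ts, src/main/services/) may look different.
type TabKind = 'webpage' | 'pdf' | 'file' | 'note' | 'llm-response';

interface Tab {
  id: string;
  kind: TabKind;
  title: string;
  extractedText: string; // produced by the content-extraction service
  selected: boolean;     // included in the next query when true
}

// Concatenate the selected tabs into a single prompt context.
function buildQueryContext(tabs: Tab[]): string {
  return tabs
    .filter((tab) => tab.selected)
    .map((tab) => `### ${tab.title} (${tab.kind})\n${tab.extractedText}`)
    .join('\n\n');
}
```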
Along with web pages, the browser supports:

- PDFs
- File previews
- Notes
- LLM responses, each opened in its own tab
Conversations become modular rather than linear, giving you finer control over how follow-up prompts are shaped and what context is included.
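Continuing the hypothetical sketch above, a model response could itself be stored as a tab, so a follow-up query includes it only when you explicitly select it:

```ts
// Hypothetical: wrap an LLM answer as a tab so it can be selected
// as context for later queries instead of carrying full chat history.
interface ResponseTab {
  id: string;
  kind: 'llm-response';
  title: string;
  extractedText: string;
  selected: boolean;
}

function responseToTab(prompt: string, answer: string): ResponseTab {
  return {
    id: `resp-${Date.now()}`,
    kind: 'llm-response',
    title: `Response: ${prompt.slice(0, 40)}`,
    extractedText: answer,
    selected: false, // opt in per follow-up rather than inheriting everything
  };
}
```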
A common workflow is opening several data sources in different tabs. These might include articles, PDFs, LLM summaries, or your own notes. You can inspect each source, select the relevant ones, and ask the model to cross-check or reconcile them. This all happens without switching tools.
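Using the same hypothetical helpers, the cross-check step might look like this:

```ts
// Continuing the sketch above: three sources are open, the relevant two
// are selected, and the model is asked to reconcile them.
const openTabs: Tab[] = [
  { id: 't1', kind: 'webpage', title: 'Article on topic X', extractedText: '…', selected: true },
  { id: 't2', kind: 'pdf', title: 'Report.pdf', extractedText: '…', selected: true },
  { id: 't3', kind: 'note', title: 'My notes', extractedText: '…', selected: false },
];

const prompt =
  buildQueryContext(openTabs) +
  '\n\nCross-check these sources and point out where they disagree.';
// A provider from src/main/providers/ would then send `prompt` to the model.
```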
For a deep dive into the architecture and design philosophy:
# Install dependencies
npm install
# Run in development
npm run electron:dev
# Build for production
npm run build
npm start
# Run tests
npm test
llm-sv-tabs/
├── src/
│   ├── main/                  # Electron main process
│   │   ├── main.ts            # Entry point
│   │   ├── tab-manager.ts     # Tab management
│   │   ├── preload.ts         # IPC preload
│   │   ├── providers/         # LLM providers
│   │   ├── services/          # Content extraction
│   │   └── templates/         # HTML templates
│   └── ui/                    # Svelte UI
│       ├── components/        # UI components
│       ├── stores/            # Svelte stores
│       └── lib/               # Utilities
└── tests/                     # Fast unit/integration tests
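As a sketch of how the pieces connect, preload.ts would typically expose a narrow IPC surface from the main process to the Svelte UI via Electron's contextBridge. The channel and method names below are assumptions, not the project's actual contract.

```ts
// src/main/preload.ts -- illustrative only; channel names are assumptions.
import { contextBridge, ipcRenderer } from 'electron';

contextBridge.exposeInMainWorld('tabsApi', {
  // List open tabs so the Svelte UI (src/ui/stores/) can render them.
  listTabs: () => ipcRenderer.invoke('tabs:list'),
  // Toggle whether a tab's extracted content is included in the next query.
  setSelected: (id: string, selected: boolean) =>
    ipcRenderer.invoke('tabs:set-selected', id, selected),
  // Send a prompt, plus the selected tabs' content, to the chosen LLM provider.
  query: (prompt: string) => ipcRenderer.invoke('llm:query', prompt),
});
```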
Comprehensive test suite with 80+ tests that runs in under 10 seconds via `npm test`.
See TESTING.md for details.
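For flavor, a fast unit test might look like the following. This assumes Vitest and tests a small, dependency-free helper; both the runner and the `formatTabHeader` function are hypothetical, not the project's actual code.

```ts
import { describe, it, expect } from 'vitest';

// A small pure helper of the kind a fast unit test would target (hypothetical).
function formatTabHeader(title: string, kind: string): string {
  return `### ${title} (${kind})`;
}

describe('formatTabHeader', () => {
  it('labels content with its title and kind', () => {
    expect(formatTabHeader('Report.pdf', 'pdf')).toBe('### Report.pdf (pdf)');
  });
});
```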
MIT