Visit maichat.io to use MaiChat directly in your browser. No downloads, no setup - just add your API keys and start chatting.
Hosted on Vercel with automatic HTTPS, global CDN, and 99.99% uptime.
Deploy your own instance:
- Vercel: Fork this repo, connect it to Vercel, and it auto-deploys on every push.
- Other hosts: Serve the production build from `dist/` on any static host (Netlify, Cloudflare Pages, etc.); see the example below.
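For example, a minimal way to self-host the static build (a sketch only; `serve` and Python's `http.server` are generic static file servers, not part of MaiChat):

```bash
# Build the app, then serve the static output in dist/ with any static file server
npm ci
npm run build
npx serve dist                                   # option 1: the "serve" npm package
# python3 -m http.server 8080 --directory dist   # option 2: Python's built-in server
```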
Keyboard-First LLM Client for Power Users
A unified interface for GPT-4o, Claude, Gemini, and Grok with advanced conversation organization, web search, image support, and precise context control.
- Topic Tree Organization - Structure conversations in a hierarchical topic system, like files in folders
- Keyboard-Centric Workflow - Vim-inspired modal interface (Input/View/Command), zero mouse required
- Powerful Search & Filtering - CLI-style query language with boolean operators; filter by topic/model/date/content/rating
- Context Control - Visual context boundary, precise token management, filter before sending
- Multi-Model Support - OpenAI (GPT-5.1/mini/nano), Anthropic Claude 4-5, Google Gemini 3/2.5, xAI Grok 4-1, all in one interface
- Image Attachments - Send screenshots and images to vision-capable models (GPT-5, Claude, Gemini, Grok)
- Web Search - Native search integration for all providers, with citations
- Activity Statistics - Track usage by date, model, and topic, with response-time analytics
- PDF Export - Export conversations to formatted PDF with customizable layout
- Privacy First - 100% client-side, all data stored locally, no backend, open source (MIT)

A keyboard-first, minimal, client-side app for organizing and running conversations with multiple LLMs in one unified interface.
- Topic tree: structure any message into a hierarchical topic system.
- Flexible context: filter/supplement what's sent to the model.
- Command-line style filtering: fast, composable commands for context control.
- Keyboard-centric: operate without a mouse; distraction-free UI.
- Pure client: vanilla JS, built with Vite; no server required.
End users don't need Node.js. Open the deployed site (e.g., GitHub Pages) and use MaiChat directly in the browser. If you self-host, serve the production build in `dist/` on any static web host.
- Node.js β₯ 18
- npm (comes with Node.js)
```bash
# Clone the repository
git clone https://github.com/ebuyakin/maichat.git
cd maichat

# Install dependencies
npm ci

# Start dev server (Vite)
npm run dev

# Open http://localhost:5173
```

```bash
# Create optimized production build
npm run build

# Preview production build locally
npm run preview
```

```bash
# Run tests once
npm test

# Watch mode for development
npm run test:watch
```

```bash
# Format code with Prettier
npm run format

# Build tutorial HTML
npm run tutorial:build

# Watch tutorial for changes
npm run tutorial:watch
```

- Create hierarchical topic structures (like folders)
- Assign any message to any topic, anytime
- Custom system messages per topic for AI behavior
- Topic-specific model parameters (temperature, max tokens); see the sketch after this list
- Never lose track of conversations across different themes
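For illustration, a topic node might carry something like the following (a hypothetical sketch; the field names are invented for this example and are not MaiChat's actual schema - see the Topic System doc for the real data model):

```js
// Hypothetical topic node, for illustration only (not MaiChat's real schema).
const topic = {
  id: 'work/llm-clients/maichat',      // position in the hierarchical topic tree
  name: 'maichat',
  parentId: 'work/llm-clients',
  systemMessage: 'Answer as a concise senior frontend engineer.', // per-topic system message
  params: { temperature: 0.3, maxTokens: 2048 },                  // per-topic model parameters
};
```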
- Modal design (Input/View/Command modes) - keys are contextual, no conflicts
- Vim-inspired navigation - j/k scroll, u/d jump messages, g/G first/last
- One-key actions - copy code (Y), rate messages (1-5), toggle flags (B/G/R)
- Fast topic/model switching - Ctrl+T, Ctrl+M
- Mouse optional - fully keyboard-driven, but mouse works when needed
- CLI-style queries - `t'work' & d<7d | s>=3` (topic "work" AND last 7 days, OR 3+ stars); more examples below
- Multi-dimensional - filter by topic, model, date, content, rating, color flags, images
- Boolean operators - AND (&), OR (|), NOT (!), grouping with parentheses
- Command history - Ctrl+P/N to reuse previous queries
- Instant results - see filtered view immediately
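A few more illustrative queries, built only from the operators and the t/d/s filters shown above (prefixes for the other dimensions are defined in the CLI Filtering Language spec):

- `t'work' & d<7d` - messages in topic "work" from the last 7 days
- `s>=3 | t'research'` - rated 3+ stars, or anything in topic "research"
- `t'work' & !(s>=3)` - topic "work", excluding messages rated 3+ stars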
- Visual boundary - see exactly what's included in context
- Token budget - real-time calculation with model-specific limits
- Filter before sending - include only relevant messages
- Smart trimming - automatic overflow handling with trim indicators (see the sketch after this list)
- Multi-model - compare context sizes across different models
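The trimming idea can be sketched generically as follows (an illustration of the technique only, not MaiChat's implementation; `countTokens` stands in for whatever token estimator is used):

```js
// Generic context-trimming sketch: keep the newest messages whose estimated token
// cost fits within the model's context budget, dropping the oldest ones first.
function trimToBudget(messages, budgetTokens, countTokens) {
  const kept = [];
  let used = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = countTokens(messages[i].content);
    if (used + cost > budgetTokens) break; // everything older than this gets trimmed
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}

// Example: crude 4-characters-per-token estimate with a 100-token budget.
const included = trimToBudget(
  [{ content: 'older message ...' }, { content: 'newest message' }],
  100,
  (text) => Math.ceil(text.length / 4),
);
```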
- OpenAI: GPT-5.1, GPT-5 mini, GPT-5 nano (400K context)
- Anthropic: Claude Sonnet 4-5, Claude Opus 4-5, Claude Haiku 4-5 (200K context)
- Google: Gemini 3 Pro Preview, Gemini 2.5 Pro, Gemini 2.5 Flash (1M context)
- xAI: Grok 4-1 Fast (reasoning & non-reasoning), Grok Code Fast 1 (2M context)
- Easy model switching, per-topic defaults
```
maichat/
├── public/          # Static assets (HTML, favicon, logo)
├── src/             # Application source code
│   ├── core/        # Store, models, settings, persistence
│   ├── features/    # UI features (history, topics, commands, etc.)
│   ├── runtime/     # Bootstrap, lifecycle management
│   ├── shared/      # Shared utilities and components
│   ├── styles/      # CSS modules
│   └── main.js      # Application entry point
├── docs/            # Design docs, specs, ADRs
├── tests/           # Unit tests (Vitest)
└── tutorial.html    # Built tutorial (generated)
```
- Tutorial - Interactive getting started guide (Ctrl+Shift+H in app)
- Keyboard Reference - Complete shortcut list (F1 in app)
- CLI Filtering Language - Query syntax specification
- Architecture - System design, runtime/UI layers, data model
- Project Vision - Goals, principles, design philosophy
- Topic System - Topic tree concepts and operations
- ADRs - Architecture Decision Records
- UI Layout - Zone structure and alignment
- Scroll Positioning - Scroll behavior specification
- Focus Management - Modal isolation and focus traps
- New Message Workflow - Send/reply lifecycle
- 100% Client-Side - No backend; we do not collect or store your conversations
- Local Storage - All conversations stored in browser IndexedDB
- Your Keys, Your Control - API keys stored in localStorage, sent only to the providers you choose (see the snippet after this list)
- Privacy-Friendly Analytics (landing pages only) - We use Vercel Web Analytics on the index/tutorial pages to measure visits. It's cookie-less and does not collect personal data. The in-app chat UI has no analytics.
- Open Source - MIT licensed, audit the code yourself
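You can check the local-only storage claim yourself from the browser DevTools console on the MaiChat tab (standard browser APIs; the exact database and key names are whatever the app defines):

```js
// List the IndexedDB databases (conversation data) and localStorage keys (API keys,
// settings) that exist for this origin. Run in the DevTools console.
const dbs = await indexedDB.databases();
console.log('IndexedDB databases:', dbs.map((d) => d.name));
console.log('localStorage keys:', Object.keys(localStorage));
```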
Contributions are welcome! Please feel free to submit issues or pull requests.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
MIT License - see LICENSE file for details.
- Markdown rendering: marked
- Math rendering: KaTeX
- Code highlighting: Prism.js
- Sanitization: DOMPurify
- Build tool: Vite
- Website: maichat.io
- Repository: github.com/ebuyakin/maichat
- Issues: github.com/ebuyakin/maichat/issues
- Changelog: CHANGELOG.md