LLM wiki with binary document ingestion (PDF, DOCX, PPTX, XLSX, images)
Local daemon with REST and WebSocket APIs for seamless integration
Shared persistent AI memory, provider routing, and tool execution
Multiple interfaces: CLI, TUI, native desktop app, and MCP server/client
Support for 6+ AI providers, including OpenAI, Anthropic, Gemini, and Ollama