Cross-platform support for Linux, macOS, and Windows
Local-first AI inference using Ollama models
Provides reasoning, embeddings, and document filtering tools
Adheres to the MCP standard for broad client compatibility
Configurable context window size for reasoning models
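In Ollama's HTTP API, the context window is typically controlled through the `num_ctx` option in a request's `options` object. A minimal sketch of such a request payload, assuming a local Ollama server on the default port and an example model name (both are illustrative, not this project's specific configuration):

```python
import json

# Hypothetical /api/generate payload for a local Ollama server.
# "llama3" and the prompt are placeholder values; num_ctx sets the
# context window size in tokens for the reasoning model.
payload = {
    "model": "llama3",
    "prompt": "Summarize this document.",
    "options": {"num_ctx": 8192},
}

# Serialized body that would be POSTed to http://localhost:11434/api/generate
print(json.dumps(payload))
```

A larger `num_ctx` lets the model reason over longer documents at the cost of more memory, so it is usually exposed as a tunable setting rather than hard-coded.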