- OpenAI-compatible streaming inference server (/completion endpoint)
- Integrated model API and downloader with hardware specs, licenses, and hashes
- Local AI model inference on desktop via Rust's llm crate
- Simple note-taking app with per-note inference configuration
- Supports local, private, and secure AI experimentation
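To illustrate the streaming /completion endpoint above, here is a minimal client-side sketch. It assumes the server emits OpenAI-style server-sent events, where each `data:` line carries a JSON chunk with a `choices[0].text` field and the stream ends with `data: [DONE]`; the exact chunk shape may differ in practice.

```python
import json

def parse_sse_stream(lines):
    """Collect completion text from OpenAI-style server-sent-event lines."""
    text = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments, blank keep-alive lines, etc.
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        text.append(chunk["choices"][0]["text"])
    return "".join(text)

# Example lines as such a server might emit them (assumed shape):
stream = [
    'data: {"choices": [{"text": "Hello"}]}',
    'data: {"choices": [{"text": ", world"}]}',
    "data: [DONE]",
]
print(parse_sse_stream(stream))  # → Hello, world
```

In a real client, the same parsing loop would run over the response body of an HTTP request to the server's /completion endpoint, yielding tokens as they arrive.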