Local.ai is a desktop application for secure, private, and cost-free AI experimentation that runs Large Language Models (LLMs) locally on your PC. It offers an integrated model downloader and API that surface essential details such as hardware requirements and licenses, alongside a simple note-taking app with per-note inference configuration. The tool also provides a streaming inference server with an OpenAI-compatible /completion endpoint, making it a natural companion for projects like window.ai that bring local AI capabilities to web applications.
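
As a rough illustration of how a client might talk to that streaming endpoint, here is a minimal TypeScript (Node) sketch. It assumes, rather than knows, the server's defaults: that it listens on http://localhost:8000, accepts an OpenAI-style JSON body with a `stream: true` flag, and emits server-sent-event lines of the form `data: {...}` ending with `data: [DONE]`. Adjust the host, port, path, and payload fields to match your actual Local.ai configuration.

```ts
// Hypothetical sketch of streaming tokens from a locally running inference server.
// The URL, port, and response shape below are assumptions, not documented defaults.

async function streamCompletion(prompt: string): Promise<void> {
  const response = await fetch("http://localhost:8000/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      max_tokens: 128,
      temperature: 0.7,
      stream: true, // ask the server to stream tokens as they are generated
    }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status} ${response.statusText}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    // Each chunk may carry one or more SSE lines of the form "data: {...}".
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      const trimmed = line.trim();
      if (!trimmed.startsWith("data:")) continue;
      const payload = trimmed.slice("data:".length).trim();
      if (payload === "[DONE]") return;

      // OpenAI-style completion chunks put the token text in choices[0].text.
      const chunk = JSON.parse(payload);
      process.stdout.write(chunk.choices?.[0]?.text ?? "");
    }
  }
}

streamCompletion("Write a haiku about running models locally.").catch(console.error);
```

Because the endpoint follows the OpenAI wire format, the same pattern should work from any OpenAI-compatible client library pointed at the local server instead of the hosted API.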