This project provides a complete self-hosted AI stack for Windows, combining Ollama for running language models locally, Open WebUI for a user-friendly chat interface, and the Model Context Protocol (MCP) for connecting external tools to those models. It includes a sample MCP tool server for managing employee leave, exposed via OpenAPI so it can be plugged into the chat interface as a tool. The whole stack runs on your own machine, giving you full control, privacy, and flexibility without relying on cloud services.
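
To give a sense of what such a tool server looks like, below is a minimal sketch using the `FastMCP` class from the official MCP Python SDK. The server name, the `get_leave_balance`/`apply_leave` tools, and the in-memory leave records are illustrative assumptions, not the actual implementation shipped in this repo.

```python
# leave_server.py -- illustrative sketch of an MCP tool server (not this repo's actual code)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("LeaveManager")

# Hypothetical in-memory leave records; a real server would read from a database or HR system.
LEAVE_BALANCES = {"E001": 18, "E002": 24}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Return the remaining leave days for the given employee ID."""
    days = LEAVE_BALANCES.get(employee_id)
    if days is None:
        return f"No record found for employee {employee_id}."
    return f"Employee {employee_id} has {days} leave day(s) remaining."

@mcp.tool()
def apply_leave(employee_id: str, days: int) -> str:
    """Deduct leave days if the employee has enough balance."""
    balance = LEAVE_BALANCES.get(employee_id, 0)
    if days > balance:
        return f"Insufficient balance: only {balance} day(s) available."
    LEAVE_BALANCES[employee_id] = balance - days
    return f"Leave approved. {LEAVE_BALANCES[employee_id]} day(s) remaining."

if __name__ == "__main__":
    # Runs over stdio by default; an OpenAPI proxy can expose it over HTTP.
    mcp.run()
```

A server like this is typically bridged to OpenAPI with a proxy such as mcpo (for example, `uvx mcpo --port 8000 -- python leave_server.py`), producing an OpenAPI-documented HTTP endpoint that Open WebUI can register as an external tool; the exact command and port here are assumptions for illustration.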