About
This project provides a complete self-hosted AI stack for Windows, combining Ollama for running language models locally, Open WebUI for a user-friendly chat interface, and the Model Context Protocol (MCP) for connecting external tools to those models. It includes a sample MCP-based tool server for managing employee leave, exposed via OpenAPI so it can be integrated with the chat interface. This setup offers full control, privacy, and flexibility without relying on cloud services.
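The leave tool server described above could be written as a small MCP server. Below is a minimal sketch, assuming the official Python `mcp` SDK's FastMCP helper; the tool names, employee IDs, and in-memory data store are illustrative, not the project's actual code.

```python
from mcp.server.fastmcp import FastMCP

# Illustrative in-memory store; a real server would use persistent storage.
employee_leaves = {
    "E001": {"balance": 18, "history": ["2024-12-25", "2025-01-01"]},
    "E002": {"balance": 20, "history": []},
}

mcp = FastMCP("LeaveManager")


@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Check how many leave days an employee has remaining."""
    data = employee_leaves.get(employee_id)
    if data is None:
        return f"Employee {employee_id} not found."
    return f"{employee_id} has {data['balance']} leave day(s) remaining."


@mcp.tool()
def apply_leave(employee_id: str, leave_dates: list[str]) -> str:
    """Apply for leave on specific dates, e.g. ['2025-04-17']."""
    data = employee_leaves.get(employee_id)
    if data is None:
        return f"Employee {employee_id} not found."
    requested = len(leave_dates)
    if requested > data["balance"]:
        return f"Insufficient balance: requested {requested}, available {data['balance']}."
    data["balance"] -= requested
    data["history"].extend(leave_dates)
    return f"Leave applied for {requested} day(s). Remaining balance: {data['balance']}."


@mcp.tool()
def get_leave_history(employee_id: str) -> str:
    """View the leave dates an employee has already taken."""
    data = employee_leaves.get(employee_id)
    if data is None:
        return f"Employee {employee_id} not found."
    history = ", ".join(data["history"]) or "none"
    return f"Leave history for {employee_id}: {history}."


@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}! How can I help you with leave management today?"


if __name__ == "__main__":
    # Serve over stdio so an MCP client or an OpenAPI bridge can attach to it.
    mcp.run()
```

A bridge such as mcpo can then expose these tools as plain HTTP routes, which is how Open WebUI typically consumes MCP servers.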
Key Features
- Apply for leave on specific dates
- View leave history
- Personalized greeting functionality
- Check employee leave balance (see the HTTP client sketch after this list)
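Because the tool server is exposed via OpenAPI, these features could also be exercised directly over HTTP. The sketch below assumes an OpenAPI bridge such as mcpo listening on localhost:8000 with routes named after the tools; both the base URL and the paths are assumptions, not confirmed endpoints.

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed address of the OpenAPI bridge


def check_balance(employee_id: str):
    # Bridges like mcpo usually expose each MCP tool as a POST route named
    # after the tool; the exact path here is an assumption.
    resp = requests.post(f"{BASE_URL}/get_leave_balance", json={"employee_id": employee_id})
    resp.raise_for_status()
    return resp.json()


def request_leave(employee_id: str, dates: list[str]):
    resp = requests.post(
        f"{BASE_URL}/apply_leave",
        json={"employee_id": employee_id, "leave_dates": dates},
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(check_balance("E001"))
    print(request_leave("E001", ["2025-04-17"]))
```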
Use Cases
- Run language models locally on Windows (see the Ollama query sketch after this list)
- Build a self-hosted AI chatbot
- Manage employee leave using a local tool server
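For the first two use cases, the locally running Ollama instance can be queried over its HTTP API. This is a minimal sketch assuming Ollama is running on its default port and that a model such as llama3 has already been pulled with `ollama pull llama3`; swap in whichever model you actually use.

```python
import requests

# Ask the local Ollama server for a single, non-streamed completion.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # assumed model name; use any model you have pulled
        "prompt": "Summarize our leave policy in one sentence.",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Open WebUI connects to the same local Ollama server, so the chat interface and scripts like this one share the same locally hosted models.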