Local AI on Windows
A self-hosted AI environment for Windows that integrates Ollama, Open WebUI, and the Model Context Protocol (MCP) for running language models locally, chatting with them, and connecting them to local tools.

About

This project provides a complete self-hosted AI stack for Windows, combining Ollama for running language models locally, Open WebUI for a user-friendly chat interface, and the Model Context Protocol (MCP) for connecting external tools to the chat. It includes a sample MCP tool server for managing employee leave, exposed via OpenAPI so Open WebUI can call it directly. This setup offers full control, privacy, and flexibility without relying on cloud services.
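A minimal sketch of what such a leave-management MCP tool server could look like, using the FastMCP helper from the official MCP Python SDK. The tool names, signatures, and in-memory data store here are illustrative assumptions, not the project's actual code:

```python
# Hypothetical MCP tool server for employee leave management.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("LeaveManager")

# Assumed in-memory leave records keyed by employee ID.
leave_records = {"E001": {"balance": 18, "history": ["2024-12-24"]}}

@mcp.tool()
def get_leave_balance(employee_id: str) -> str:
    """Return the remaining leave balance for an employee."""
    record = leave_records.get(employee_id)
    if record is None:
        return f"No record found for employee {employee_id}."
    return f"{employee_id} has {record['balance']} leave day(s) remaining."

@mcp.tool()
def apply_leave(employee_id: str, dates: list[str]) -> str:
    """Apply for leave on the given ISO dates and update the balance."""
    record = leave_records.get(employee_id)
    if record is None or record["balance"] < len(dates):
        return "Leave request rejected: unknown employee or insufficient balance."
    record["balance"] -= len(dates)
    record["history"].extend(dates)
    return f"Leave approved for {', '.join(dates)}. Remaining balance: {record['balance']}."

if __name__ == "__main__":
    # Runs over stdio by default; an MCP-to-OpenAPI proxy can then expose
    # these tools as HTTP endpoints that Open WebUI can call.
    mcp.run()
```

Exposing the server through an OpenAPI layer keeps the chat interface decoupled from the tool implementation, which is the integration pattern the project describes.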

Key Features

  • Apply for leave on specific dates
  • View leave history
  • Personalized greeting functionality
  • Check employee leave balance

Use Cases

  • Run language models locally on Windows (see the sketch after this list)
  • Build a self-hosted AI chatbot
  • Manage employee leave using a local tool server
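For the first use case, a quick way to talk to a locally running Ollama instance is its REST API on the default port 11434. The model name below is an assumption; substitute whatever model you have pulled:

```python
# Query a local Ollama instance for a single, non-streamed completion.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # assumed model; replace with one you have pulled
        "prompt": "Summarize the benefits of running LLMs locally.",
        "stream": False,    # return one complete JSON response
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])  # the generated text
```

Open WebUI talks to the same local endpoint, so anything that works here is also available from the chat interface.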