# LibreModel
Connects Claude Desktop to a local Large Language Model (LLM) instance powered by llama-server.
## About
LibreModel acts as a Model Context Protocol (MCP) server that lets Claude Desktop interact with local Large Language Models (LLMs) served by llama-server. The bridge translates MCP messages into llama-server API calls, providing full conversation support, fine-grained parameter control, and performance tracking for local models from within the familiar Claude Desktop environment.
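As a rough illustration of the "translation" step, the sketch below builds a llama-server chat request from a list of role/content messages and sends it to the server's OpenAI-compatible `/v1/chat/completions` endpoint. The base URL, defaults, and function names are assumptions for illustration, not this project's actual implementation:

```python
import json
import urllib.request

# Assumed default llama-server address; adjust to your local instance.
LLAMA_SERVER_URL = "http://localhost:8080"

def build_chat_payload(messages, temperature=0.7, max_tokens=512,
                       top_p=0.95, top_k=40):
    """Translate role/content messages into a request body for
    llama-server's OpenAI-compatible /v1/chat/completions endpoint."""
    return {
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "top_p": top_p,
        "top_k": top_k,
    }

def chat(messages, **params):
    """POST the payload to the local llama-server and return the reply text."""
    body = json.dumps(build_chat_payload(messages, **params)).encode()
    req = urllib.request.Request(
        LLAMA_SERVER_URL + "/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    return reply["choices"][0]["message"]["content"]
```

In this sketch, the sampling parameters (`temperature`, `max_tokens`, `top_p`, `top_k`) pass straight through to llama-server, which is what makes per-request parameter control from Claude Desktop possible.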
## Key Features
- Full conversation support with local LLMs via Claude Desktop
- Comprehensive parameter control (temperature, max_tokens, top_p, top_k)
- Health monitoring and server status checks
- Built-in testing tools for LLM capabilities
- Easy configuration through environment variables
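The health-monitoring feature can be sketched against llama-server's `GET /health` endpoint, which returns HTTP 200 once a model is loaded and 503 while it is still loading. The base URL and function names below are illustrative assumptions, not this project's actual code:

```python
import urllib.request
import urllib.error

def classify_health(status_code):
    """Map a llama-server /health status code to a simple state string:
    200 -> model loaded and ready, 503 -> still loading, anything else -> error."""
    if status_code == 200:
        return "ok"
    if status_code == 503:
        return "loading"
    return "error"

def check_server(base_url="http://localhost:8080"):
    """Query the local llama-server health endpoint (base_url is an
    assumed default) and report its state."""
    try:
        with urllib.request.urlopen(base_url + "/health") as resp:
            return classify_health(resp.status)
    except urllib.error.HTTPError as e:
        return classify_health(e.code)
    except OSError:
        # Connection refused or timed out: the server is not running at all.
        return "unreachable"
```

Separating the status-code mapping from the network call keeps the interpretation logic testable without a live server.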
## Use Cases
- Engaging in conversations with locally hosted LLMs through Claude Desktop
- Testing the capabilities and performance of local AI models
- Monitoring the operational status of a local LLM server