LibreModel

Connects Claude Desktop to a local Large Language Model (LLM) instance powered by llama-server.

Introduction

LibreModel acts as a Model Context Protocol (MCP) server that lets Claude Desktop users interact seamlessly with local Large Language Models (LLMs) served by llama-server. The bridge translates MCP messages into llama-server API calls, providing full conversation support, comprehensive parameter control, and performance tracking for local AI within the familiar Claude Desktop environment.
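To make the translation concrete, here is a minimal sketch of what such a bridge call might look like. It is not LibreModel's actual code: the OpenAI-compatible /v1/chat/completions endpoint, the default port 8080, and the LLAMA_SERVER_URL variable name are assumptions.

```python
# Illustrative sketch (not the project's implementation): translating a chat
# request into a llama-server call via its OpenAI-compatible API.
import os
import requests

# Hypothetical environment variable; LibreModel's actual variable names may differ.
LLAMA_SERVER_URL = os.environ.get("LLAMA_SERVER_URL", "http://localhost:8080")

def chat(messages, temperature=0.7, max_tokens=512, top_p=0.95, top_k=40):
    """Forward a conversation (list of {"role", "content"} dicts) to llama-server."""
    payload = {
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "top_p": top_p,
        "top_k": top_k,  # extra sampling parameter accepted by llama-server
    }
    resp = requests.post(f"{LLAMA_SERVER_URL}/v1/chat/completions",
                         json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat([{"role": "user", "content": "Hello from Claude Desktop!"}]))
```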

Key Features

  • Full conversation support with local LLMs via Claude Desktop
  • Comprehensive parameter control (temperature, max_tokens, top_p, top_k)
  • Health monitoring and server status checks (see the sketch after this list)
  • Built-in testing tools for LLM capabilities
  • Easy configuration through environment variables
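As a rough illustration of the health-monitoring and environment-variable configuration features, the sketch below polls llama-server's /health endpoint. The endpoint (exposed by recent llama.cpp builds) and the LLAMA_SERVER_URL variable name are assumptions rather than confirmed details of this project.

```python
# Hedged sketch of a server status check against a local llama-server instance.
import os
import requests

# Hypothetical environment variable used for configuration in this sketch.
LLAMA_SERVER_URL = os.environ.get("LLAMA_SERVER_URL", "http://localhost:8080")

def check_health() -> dict:
    """Return a small status report for the local llama-server instance."""
    try:
        resp = requests.get(f"{LLAMA_SERVER_URL}/health", timeout=5)
        return {"reachable": True, "status_code": resp.status_code, "body": resp.text}
    except requests.RequestException:
        return {"reachable": False, "status_code": None, "body": None}

if __name__ == "__main__":
    print(check_health())
```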

Use Cases

  • Engaging in conversations with locally hosted LLMs through Claude Desktop.
  • Testing the capabilities and performance of local AI models (see the timing sketch below).
  • Monitoring the operational status of a local LLM server.
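For the testing use case, one simple way to gauge performance is to time a completion and derive tokens per second from the response. This is an illustrative sketch under the same assumptions as above (OpenAI-compatible endpoint, OpenAI-style "usage" field in the response); it is not part of LibreModel itself.

```python
# Illustrative performance probe: time one completion and estimate tokens/second.
import os
import time
import requests

LLAMA_SERVER_URL = os.environ.get("LLAMA_SERVER_URL", "http://localhost:8080")

def measure(prompt: str, max_tokens: int = 256) -> None:
    start = time.perf_counter()
    resp = requests.post(
        f"{LLAMA_SERVER_URL}/v1/chat/completions",
        json={"messages": [{"role": "user", "content": prompt}],
              "max_tokens": max_tokens},
        timeout=300,
    )
    resp.raise_for_status()
    elapsed = time.perf_counter() - start
    tokens = resp.json().get("usage", {}).get("completion_tokens")
    if tokens:
        print(f"{tokens} tokens in {elapsed:.2f}s ({tokens / elapsed:.1f} tok/s)")
    else:
        print(f"Completed in {elapsed:.2f}s (no usage data returned)")

if __name__ == "__main__":
    measure("Write a haiku about local LLMs.")
```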