Custom

Establishes a bridge between Model Context Protocol clients and local, OpenAI-compatible Large Language Model services.

About

Custom is a Model Context Protocol (MCP) server that connects MCP-compatible applications to your self-hosted, OpenAI-compatible Large Language Model (LLM) services. It translates MCP requests for chat completion, model listing, and health checks into calls against your local LLM backend. With streaming support, pre-built prompts for common AI tasks, and environment-based configuration, Custom gives developers a straightforward way to use local LLMs within the MCP ecosystem.
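
To illustrate the translation layer, here is a minimal sketch (not the server's actual source) of how a chat-completion request could be forwarded to an OpenAI-compatible backend. The function name, the LLM_BASE_URL variable, and the localhost default are illustrative assumptions; the request and response shape follow the standard OpenAI chat completions API.

```typescript
// Sketch only: forward a chat-completion request to an OpenAI-compatible backend.
// LLM_BASE_URL and its default are assumed names, not documented settings.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatCompletion(messages: ChatMessage[], model: string): Promise<string> {
  const baseUrl = process.env.LLM_BASE_URL ?? "http://localhost:8080/v1"; // assumed variable name and default
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!response.ok) {
    throw new Error(`LLM backend returned HTTP ${response.status}`);
  }
  const data = await response.json();
  // OpenAI-compatible services return the assistant reply at choices[0].message.content.
  return data.choices[0].message.content as string;
}
```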

Key Features

  • Integration with any OpenAI-compatible LLM service
  • Supports chat completion, model listing, and health checks
  • Full implementation of Model Context Protocol
  • Configurable via environment variables (see the configuration sketch after this list)
  • Provides both streaming and non-streaming responses
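
The page does not list the exact environment variables, so the sketch below shows one plausible configuration shape. The variable names (LLM_BASE_URL, LLM_API_KEY, LLM_DEFAULT_MODEL, LLM_TIMEOUT_MS) are assumptions for illustration only.

```typescript
// Hypothetical environment-based configuration; variable names are assumed,
// not taken from the server's documentation.
interface ServerConfig {
  baseUrl: string;       // OpenAI-compatible endpoint, e.g. http://localhost:8080/v1
  apiKey?: string;       // optional; many local backends accept any key or none
  defaultModel: string;  // model used when a request does not specify one
  timeoutMs: number;     // per-request timeout
}

function loadConfig(env: NodeJS.ProcessEnv = process.env): ServerConfig {
  return {
    baseUrl: env.LLM_BASE_URL ?? "http://localhost:8080/v1",
    apiKey: env.LLM_API_KEY,
    defaultModel: env.LLM_DEFAULT_MODEL ?? "local-model",
    timeoutMs: Number(env.LLM_TIMEOUT_MS ?? 60000),
  };
}
```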

Use Cases

  • Routing and managing AI requests to a self-hosted Large Language Model backend.
  • Integrating local LLM services with MCP client applications like Claude Desktop (a sample registration sketch follows this list).
  • Developing and testing AI applications that utilize the Model Context Protocol standard.
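
For the Claude Desktop use case, MCP servers are registered in claude_desktop_config.json under the mcpServers key. The entry below is only a sketch: the launch command, file path, and environment variable name are placeholders, since this page does not document the server's actual entry point or settings.

```json
{
  "mcpServers": {
    "custom-llm": {
      "command": "node",
      "args": ["/path/to/custom-mcp-server/index.js"],
      "env": {
        "LLM_BASE_URL": "http://localhost:8080/v1"
      }
    }
  }
}
```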