Establishes a bridge between Model Context Protocol clients and local, OpenAI-compatible Large Language Model services.
Custom is a Model Context Protocol (MCP) server that connects MCP-compatible applications to your self-hosted, OpenAI-compatible Large Language Model (LLM) services. It translates MCP requests for chat completion, model listing, and health checks into calls against your local LLM's OpenAI-compatible API. With streaming support, pre-built prompts for common AI tasks, and environment-based configuration, Custom gives developers a straightforward way to use local LLMs from within the MCP ecosystem.
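
At its core, the bridge reads connection details from the environment and forwards requests to the configured endpoint using the standard OpenAI chat completions format. The sketch below illustrates that flow in plain Python; the environment variable names (`LLM_BASE_URL`, `LLM_MODEL`) and the default URL are illustrative assumptions, not Custom's actual configuration keys.

```python
# Minimal sketch of the kind of call the bridge forwards to a local
# OpenAI-compatible endpoint. Variable names and defaults below are
# assumptions for illustration only.
import json
import os
import urllib.request

base_url = os.environ.get("LLM_BASE_URL", "http://localhost:8000/v1")  # assumed variable name
model = os.environ.get("LLM_MODEL", "local-model")                     # assumed variable name

# Standard OpenAI-compatible chat completion payload.
payload = {
    "model": model,
    "messages": [{"role": "user", "content": "Hello from an MCP client"}],
}

request = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())
    print(result["choices"][0]["message"]["content"])
```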