About
Custom is a Model Context Protocol (MCP) server that integrates MCP-compatible applications with self-hosted, OpenAI-compatible Large Language Model (LLM) services. It translates MCP requests for chat completion, model listing, and health checks into calls to your local LLM. With streaming support, pre-built prompts for common AI tasks, and environment-based configuration, Custom gives developers a straightforward way to use local LLMs within the MCP ecosystem.
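To make the translation step concrete, here is a minimal sketch of the kind of OpenAI-compatible chat-completion request such a server forwards to a local backend. The function name, endpoint URL, and model name are illustrative assumptions, not taken from the Custom codebase:

```python
import json

def build_chat_request(base_url, model, messages, stream=False):
    """Build the URL and JSON body for an OpenAI-compatible
    /v1/chat/completions call (hypothetical helper, for illustration)."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    body = {
        "model": model,       # model name served by your local backend
        "messages": messages, # standard OpenAI chat message list
        "stream": stream,     # True for server-sent-event streaming
    }
    return url, json.dumps(body)

# Example: a request an MCP chat-completion call might map to.
url, body = build_chat_request(
    "http://localhost:8000",
    "local-model",
    [{"role": "user", "content": "Hello"}],
)
print(url)  # http://localhost:8000/v1/chat/completions
```

Because the wire format is the standard OpenAI chat schema, the same request shape works against any compatible backend (llama.cpp server, vLLM, Ollama's OpenAI endpoint, etc.).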
Key Features
- Integration with any OpenAI-compatible LLM service
- Supports chat completion, model listing, and health checks
- Full implementation of Model Context Protocol
- Configurable via environment variables
- Provides both streaming and non-streaming responses
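Environment-based configuration typically means pointing the server at your backend before launching it. A hedged sketch follows; the variable names are assumptions, so check the project's own documentation for the names it actually reads:

```shell
# Hypothetical environment variables (names are illustrative):
export LLM_BASE_URL="http://localhost:8000/v1"  # your OpenAI-compatible endpoint
export LLM_API_KEY="sk-local"                   # key, if your backend requires one
export LLM_DEFAULT_MODEL="my-local-model"       # model name served by your backend
```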
Use Cases
- Routing and managing AI requests to a self-hosted Large Language Model backend.
- Integrating local LLM services with MCP client applications like Claude Desktop.
- Developing and testing AI applications that utilize the Model Context Protocol standard.
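MCP clients such as Claude Desktop register servers in a JSON configuration file under an `mcpServers` key. A sketch of what an entry for this server might look like; the command, file path, and environment variable name are assumptions for illustration:

```json
{
  "mcpServers": {
    "custom": {
      "command": "node",
      "args": ["/path/to/custom/build/index.js"],
      "env": {
        "LLM_BASE_URL": "http://localhost:8000/v1"
      }
    }
  }
}
```

After adding an entry like this and restarting the client, the server's chat-completion, model-listing, and health-check capabilities become available to the client over MCP.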