About
The MCP LLM Bridge provides a bidirectional protocol translation layer, enabling seamless communication between MCP servers and OpenAI-compatible language models (LLMs). It converts MCP tool specifications into OpenAI function schemas and maps function invocations back to MCP tool executions. This allows any OpenAI-compatible LLM, whether cloud-based or local (like Ollama), to leverage MCP-compliant tools through a standardized interface, extending the capabilities of both systems.
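The MCP-to-OpenAI direction of that translation can be sketched as follows. This is an illustrative example, not the bridge's actual implementation: it assumes an MCP tool spec with `name`, `description`, and a JSON Schema `inputSchema`, and emits the nested `function` object that OpenAI-style function calling expects.

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Convert an MCP tool specification into an OpenAI function schema.

    MCP tools expose `name`, `description`, and a JSON Schema `inputSchema`;
    OpenAI function calling expects the same data nested under a `function`
    key, with the input schema renamed to `parameters`.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # Fall back to an empty object schema if the tool takes no input.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }


# Illustrative MCP tool spec (hypothetical tool name and schema).
mcp_tool = {
    "name": "query_database",
    "description": "Run a read-only SQL query.",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}

openai_schema = mcp_tool_to_openai_function(mcp_tool)
```

Because both sides describe tool inputs as JSON Schema, the translation is mostly a matter of renaming and nesting fields rather than transforming the schema itself.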
Key Features
- Supports OpenAI API and local endpoints implementing the OpenAI API specification.
- Configuration options for OpenAI, Ollama, and LM Studio.
- Enables any OpenAI-compatible LLM to use MCP-compliant tools.
- Bidirectional protocol translation between MCP and OpenAI function calling.
- Enables LLM-driven database queries through MCP tools.
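Because Ollama and LM Studio both expose OpenAI-compatible endpoints, configuring the bridge for any of the three backends mostly comes down to a model name and a base URL. The sketch below is a hypothetical configuration shape (the `LLMConfig` class and field names are assumptions, not the bridge's actual API); the local ports shown are the default Ollama and LM Studio endpoints.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LLMConfig:
    """Minimal endpoint configuration sketch (illustrative, not the bridge's API)."""
    api_key: str
    model: str
    base_url: Optional[str] = None  # None -> default OpenAI cloud endpoint


# OpenAI (cloud): real API key, default endpoint.
openai_cfg = LLMConfig(api_key="sk-...", model="gpt-4o")

# Ollama (local): key is ignored but must be non-empty; default port 11434.
ollama_cfg = LLMConfig(
    api_key="ollama",
    model="llama3",
    base_url="http://localhost:11434/v1",
)

# LM Studio (local): default port 1234.
lmstudio_cfg = LLMConfig(
    api_key="lm-studio",
    model="local-model",
    base_url="http://localhost:1234/v1",
)
```

Local backends typically ignore the API key but still require a non-empty string, since OpenAI client libraries refuse to send requests without one.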
Use Cases
- Using OpenAI models to interact with MCP-compliant tools.
- Integrating local LLMs with MCP servers.
- Extending the capabilities of existing LLMs with MCP tools.
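The reverse direction of the bridge, mapping a model's function invocation back onto an MCP tool execution, can be sketched like this. It assumes the OpenAI-style tool-call shape (arguments arrive as a JSON-encoded string) and an MCP `tools/call` request with a decoded `arguments` object; the tool name and arguments below are hypothetical.

```python
import json


def openai_call_to_mcp_request(tool_call: dict) -> dict:
    """Translate an OpenAI-style tool call into an MCP `tools/call` request.

    OpenAI-compatible APIs return function arguments as a JSON-encoded
    string; MCP expects them decoded into an `arguments` object.
    """
    fn = tool_call["function"]
    return {
        "method": "tools/call",
        "params": {
            "name": fn["name"],
            # Treat an empty arguments string as "no arguments".
            "arguments": json.loads(fn["arguments"] or "{}"),
        },
    }


# Illustrative tool call as returned by an OpenAI-compatible API.
call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "query_database", "arguments": '{"sql": "SELECT 1"}'},
}

request = openai_call_to_mcp_request(call)
```

The bridge would then send this request to the MCP server, collect the tool result, and feed it back to the model as the tool call's response, closing the loop between the two protocols.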