Simple Ollama Bridge
Connects Model Context Protocol (MCP) servers to OpenAI-compatible LLMs like Ollama.
About
The Simple Ollama Bridge connects Model Context Protocol (MCP) servers to Large Language Models (LLMs) such as Ollama by sitting between an MCP server and the LLM's API endpoint. It lets users leverage local LLMs within an MCP environment, supporting resource access, prompt engineering, tool integration, and custom sampling strategies.
Key Features
- Bridges MCP servers to OpenAI-compatible LLMs (e.g., Ollama).
- Configurable LLM endpoint; works with any endpoint that adheres to the OpenAI API specification.
- Supports local LLM deployment.
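OpenAI-API compatibility means the bridge can forward chat requests as plain `POST`s to a `/chat/completions` route. A minimal sketch of what such a request looks like, assuming Ollama's default OpenAI-compatible endpoint at `http://localhost:11434/v1`; the `build_chat_request` helper and the `llama3` model name are illustrative, not part of this project:

```python
import json
import urllib.request

def build_chat_request(endpoint, model, messages):
    # /chat/completions is the standard OpenAI-compatible route;
    # any endpoint following the spec accepts this request shape.
    url = endpoint.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Point the same request at a local Ollama instance or any other
# OpenAI-compatible server just by changing the endpoint string.
req = build_chat_request(
    "http://localhost:11434/v1",
    "llama3",
    [{"role": "user", "content": "Hello"}],
)
```

Because only the endpoint string changes between backends, swapping Ollama for another OpenAI-compatible server requires no changes to the MCP side.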
Use Cases
- Customizing LLM configurations for MCP server interactions.
- Integrating local LLMs with MCP workflows.
- Testing and experimenting with MCP functionalities using Ollama.
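When testing MCP functionality against Ollama, replies come back in the standard OpenAI response shape, so client code reads them the same way regardless of which backend served the request. A hedged sketch with an illustrative sample payload (not captured from a real server):

```python
import json

# Illustrative OpenAI-style chat completion response; Ollama's
# OpenAI-compatible endpoint returns this same structure.
sample = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "Hello from the model"}}
  ]
}
""")

# Clients read the assistant's reply from choices[0].message.content
# no matter which OpenAI-compatible backend produced it.
reply = sample["choices"][0]["message"]["content"]
```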