Connects Model Context Protocol (MCP) servers to OpenAI-compatible LLM backends such as Ollama.
The Simple Ollama Bridge relays traffic between Model Context Protocol (MCP) servers and Large Language Model (LLM) backends such as Ollama. Acting as a bridge between an MCP server and the LLM's API endpoint, it lets users drive local LLMs from an MCP environment, with support for resource access, prompt engineering, tool integration, and custom sampling strategies.
Key Features
1. Configurable LLM endpoint.
2. Supports local LLM deployment.
3. Bridges MCP servers to OpenAI-compatible LLMs (e.g., Ollama).
4. Compatible with any endpoint adhering to the OpenAI API specification.
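Because the bridge targets any endpoint that follows the OpenAI chat-completions API, the request it forwards can be sketched with the standard library alone. This is a minimal illustration, not the bridge's actual code: the `OLLAMA_URL` default assumes Ollama's stock OpenAI-compatible endpoint on port 11434, and the model name in the usage note is a placeholder.

```python
import json
import urllib.request

# Assumed default: Ollama exposes an OpenAI-compatible API here.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(model: str, messages: list[dict]) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {"model": model, "messages": messages, "stream": False}

def chat(model: str, messages: list[dict], url: str = OLLAMA_URL) -> str:
    """POST the payload to the endpoint and return the reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(model, messages)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply under choices[0].
    return body["choices"][0]["message"]["content"]
```

With a local Ollama instance running, a call like `chat("llama3", [{"role": "user", "content": "Hello"}])` would return the model's reply; swapping `url` for any other OpenAI-compatible endpoint requires no other changes, which is the point of feature 4 above.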
Use Cases
1. Customizing LLM configurations for MCP server interactions.
2. Integrating local LLMs with MCP workflows.
3. Testing and experimenting with MCP functionality using Ollama.