LLM Wrapper
by matdev83
Enables MCP-capable LLM agents to communicate with and delegate tasks to other LLMs available through the OpenRouter.ai API.
About
The LLM Wrapper is a Model Context Protocol (MCP) server that allows any MCP-capable LLM agent to communicate with and delegate tasks to other LLMs available through the OpenRouter.ai API. It exposes a standardized interface: a robust, flexible server that handles LLM calls, tool execution, and result processing, and it integrates with `llm-accounting` for logging, rate limiting, and auditing.
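The sketch below shows, in rough terms, how an MCP client (for example, an agent framework) might connect to the wrapper over stdio and delegate a prompt to an OpenRouter-hosted model using the official MCP Python SDK. The launch command, module name, tool name, and argument names are assumptions for illustration; consult the repository for the actual tool schema.

```python
# Minimal sketch using the official MCP Python SDK ("mcp" package).
# The server command, tool name, and argument names below are assumptions,
# not the project's documented interface.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="python",               # hypothetical launch command
    args=["-m", "llm_wrapper"],     # hypothetical module name
)

async def delegate_task() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the wrapper exposes via MCP.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Delegate a prompt to a specific OpenRouter model.
            result = await session.call_tool(
                "chat_completion",                  # hypothetical tool name
                {
                    "model": "openai/gpt-4o-mini",  # any OpenRouter model id
                    "prompt": "Summarize the MCP specification in two sentences.",
                },
            )
            print(result.content)

if __name__ == "__main__":
    asyncio.run(delegate_task())
```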
Key Features
- Provides a FastAPI-based server for handling LLM requests and responses.
- Integrates with `llm-accounting` for robust logging, rate limiting, and auditing.
- Supports advanced MCP features such as tool calls and tool results.
- Implements the Model Context Protocol (MCP) specification.
- Configurable to use various LLM providers, e.g. OpenRouter (see the server-side sketch after this list).
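As a rough illustration of the architecture rather than the project's actual code, the sketch below shows the general shape of a FastAPI endpoint that forwards a chat request to OpenRouter through its OpenAI-compatible API. The route, request model, and environment variable name are assumptions.

```python
# Conceptual sketch of the server side: a FastAPI endpoint that forwards a
# chat request to OpenRouter's OpenAI-compatible API. This is NOT the
# project's actual code; the route and field names are assumptions.
import os

from fastapi import FastAPI
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()

# OpenRouter exposes an OpenAI-compatible API at this base URL.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

class ChatRequest(BaseModel):
    model: str   # e.g. "anthropic/claude-3.5-sonnet"
    prompt: str

@app.post("/v1/chat")  # hypothetical route
def chat(req: ChatRequest) -> dict:
    completion = client.chat.completions.create(
        model=req.model,
        messages=[{"role": "user", "content": req.prompt}],
    )
    # The real server would also record usage via llm-accounting
    # before returning the result to the calling agent.
    return {"content": completion.choices[0].message.content}
```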
Use Cases
- Adding monitoring and auditing to LLM interactions (see the middleware sketch after this list).
- Enabling LLM agents to delegate tasks to specialized LLMs via OpenRouter.
- Creating a standardized interface for interacting with multiple LLM providers.
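For the monitoring use case, the sketch below illustrates the general idea of per-request auditing using plain FastAPI middleware and the standard library logger. The actual project delegates this work to `llm-accounting`, whose API is not reproduced here.

```python
# Illustration of per-request audit logging around LLM calls using plain
# FastAPI middleware and the standard library. The real project uses the
# llm-accounting package for this; its API is not shown here.
import logging
import time

from fastapi import FastAPI, Request

logger = logging.getLogger("llm_audit")
app = FastAPI()

@app.middleware("http")
async def audit_llm_requests(request: Request, call_next):
    start = time.perf_counter()
    response = await call_next(request)
    elapsed_ms = (time.perf_counter() - start) * 1000
    # Record who called what, and how long the upstream LLM call took.
    logger.info(
        "path=%s status=%s duration_ms=%.1f",
        request.url.path, response.status_code, elapsed_ms,
    )
    return response
```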