Exposes an API as tools to large language model clients using the Model Context Protocol (MCP).
This tool is a dual-transport Model Context Protocol (MCP) server that bridges your existing APIs with LLM clients. It supports local (stdio) transport for desktop applications such as Claude Desktop, Cursor, and Windsurf, and remote (HTTP/SSE) transport for web clients and services such as the OpenAI Responses API. Built with the official MCP Python SDK and FastAPI, it uses strict JSON schemas for deterministic tool behavior and includes authentication, rate limiting, and other security best practices, following SOLID principles so that custom functionality can be integrated into AI workflows securely and maintainably.
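The "strict JSON schemas" point can be illustrated with a stdlib-only sketch: each tool ships a schema that rejects missing, mistyped, or unknown arguments before the handler runs, so the model's tool calls either conform exactly or fail loudly. The tool name, schema fields, and validator below are illustrative assumptions, not the server's actual definitions; a real deployment would rely on the MCP SDK's own schema handling.

```python
# Hypothetical tool definition in the shape MCP clients expect:
# a name, a description, and a strict JSON schema for its arguments.
GET_WEATHER_TOOL = {
    "name": "get_weather",
    "description": "Return current weather for a city.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "units": {"type": "string", "enum": ["metric", "imperial"]},
        },
        "required": ["city"],
        "additionalProperties": False,  # strict: reject unknown keys
    },
}

def validate_args(schema: dict, args: dict) -> list[str]:
    """Tiny validator covering only the JSON Schema subset used above."""
    errors = []
    props = schema.get("properties", {})
    for key in schema.get("required", []):
        if key not in args:
            errors.append(f"missing required argument: {key}")
    for key, value in args.items():
        if key not in props:
            if not schema.get("additionalProperties", True):
                errors.append(f"unknown argument: {key}")
            continue
        spec = props[key]
        if spec.get("type") == "string" and not isinstance(value, str):
            errors.append(f"{key} must be a string")
        if "enum" in spec and value not in spec["enum"]:
            errors.append(f"{key} must be one of {spec['enum']}")
    return errors

print(validate_args(GET_WEATHER_TOOL["inputSchema"], {"city": "Oslo"}))        # valid: []
print(validate_args(GET_WEATHER_TOOL["inputSchema"], {"city": 1, "x": True}))  # two errors
```

Setting `"additionalProperties": False` is what makes the schema strict: a hallucinated extra argument from the model becomes a validation error instead of being silently ignored.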