This tool is a dual-transport Model Context Protocol (MCP) server that bridges your existing APIs with LLM clients. It supports stdio transport for local desktop applications such as Claude Desktop, Cursor, and Windsurf, and HTTP/SSE transport for remote web clients and services such as the OpenAI Responses API. Built on the official MCP Python SDK and FastAPI, it exposes tools with strict JSON schemas for deterministic behavior, includes authentication, rate limiting, and security best practices, and follows SOLID principles, enabling secure integration of custom functionality into AI workflows.
Key Features
- Leverages FastAPI for HTTP/SSE transport
- Uses the official MCP Python SDK for stdio transport
- Includes authentication, rate limiting, and security best practices
- Supports dual transports: stdio (local) and HTTP/SSE (remote)
- Exposes tools with strict JSON schemas for deterministic behavior
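The "strict JSON schemas for deterministic behavior" point can be illustrated without the SDK: validate every incoming argument object against a declared schema and reject unknown keys, missing fields, and wrong types before the tool runs, so the tool never sees unspecified input. A stdlib-only sketch (all names here are hypothetical, not this project's API):

```python
# Illustrative sketch of strict-schema argument checking; the function and
# schema names are hypothetical examples, not part of this server's API.
from typing import Any

TYPE_MAP = {"integer": int, "number": (int, float), "string": str, "boolean": bool}


def validate_args(schema: dict[str, Any], args: dict[str, Any]) -> None:
    """Raise ValueError unless args exactly satisfies the object schema."""
    props = schema["properties"]
    # Reject unknown keys so behavior never depends on unspecified input.
    extra = set(args) - set(props)
    if extra:
        raise ValueError(f"unexpected arguments: {sorted(extra)}")
    for name in schema.get("required", []):
        if name not in args:
            raise ValueError(f"missing required argument: {name}")
    for name, value in args.items():
        declared = props[name]["type"]
        if not isinstance(value, TYPE_MAP[declared]) or (
            isinstance(value, bool) and declared != "boolean"  # bool is an int subtype
        ):
            raise ValueError(f"{name}: expected {declared}")


ADD_SCHEMA = {
    "type": "object",
    "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
    "required": ["a", "b"],
    "additionalProperties": False,
}
```

Running the check before dispatch means a tool call either executes with exactly the declared inputs or fails fast with a clear error, which is what makes tool behavior deterministic from the client's perspective.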