About
AI agents that connect to multiple MCP servers often suffer from excessive token consumption, added latency from processing tool definitions they never use, and more frequent hallucinations caused by information overload. MCP of MCPs addresses these problems by acting as a meta-server that aggregates other MCP servers and exposes three tools:

- `get_mcps_servers_overview`: lightweight discovery of the available servers
- `get_tools_overview`: on-demand loading of a specific server's tool definitions
- `run_functions_code`: direct, in-memory data flow between tools

This progressive-disclosure approach sharply reduces token usage, speeds up execution by skipping unnecessary model processing, and improves accuracy by presenting only the information relevant to each stage of the agent's workflow.
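The three-stage flow above can be sketched in plain Python. This is a minimal in-memory illustration of the progressive-disclosure idea, not the project's actual implementation: the `MetaServer` class, the registered servers (`weather`, `mailer`), and their tools are all hypothetical stand-ins.

```python
class MetaServer:
    """Toy aggregator standing in for a meta-server over several MCP servers."""

    def __init__(self):
        # server name -> {"description": str, "tools": {tool name: (description, fn)}}
        self.registry = {}

    def register(self, server, description, tools):
        self.registry[server] = {"description": description, "tools": tools}

    def get_mcps_servers_overview(self):
        # Stage 1: lightweight discovery. Only names and one-line descriptions
        # reach the model, so unneeded tool schemas never consume tokens.
        return {name: entry["description"] for name, entry in self.registry.items()}

    def get_tools_overview(self, server):
        # Stage 2: on-demand loading of one server's tool definitions.
        return {tool: desc for tool, (desc, _) in self.registry[server]["tools"].items()}

    def run_functions_code(self, code):
        # Stage 3: execute agent-written glue code so intermediate results
        # flow tool-to-tool in memory instead of round-tripping through the model.
        env = {}
        for name, entry in self.registry.items():
            env[name] = {tool: fn for tool, (_, fn) in entry["tools"].items()}
        exec(code, env)
        return env.get("result")


meta = MetaServer()
meta.register("weather", "Forecasts and current conditions", {
    "current_temp": ("Current temperature (C) for a city",
                     lambda city: {"berlin": 18, "oslo": 9}[city]),
})
meta.register("mailer", "Sends notification emails", {
    "send": ("Send a message; returns a delivery status",
             lambda text: f"sent: {text}"),
})

# The agent first discovers servers, then loads only the tools it needs,
# then chains them with a short code snippet.
print(meta.get_mcps_servers_overview())
status = meta.run_functions_code(
    "t = weather['current_temp']('oslo')\n"
    "result = mailer['send'](f'Oslo is {t}C')\n"
)
print(status)  # -> sent: Oslo is 9C
```

Note how the temperature value never passes through the model between the two tool calls; only the final `status` would be surfaced to the agent, which is the latency and token saving the paragraph above describes.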