Provides advanced code analysis and reasoning by orchestrating Google's Gemini AI alongside Claude Code, targeting complex debugging and the understanding of distributed systems.
This Model Context Protocol (MCP) server combines the strengths of Claude Code and Google's Gemini AI to offer sophisticated code analysis. It enables a multi-model workflow in which Claude handles local context and CLI-native operations, while Gemini leverages its 1M-token context window and code-execution capability for deep dives into distributed-system debugging, long-trace analysis, and performance modeling. The server implements an intelligent routing strategy that treats the LLMs as heterogeneous microservices, dispatching each sub-task to the most capable model, and supports novel AI-to-AI conversational analysis.
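The routing idea above can be sketched as a capability-matching heuristic. The following is a minimal illustration, not the server's actual implementation; the class names, fields, and token limits (beyond Gemini's stated 1M-token window) are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class ModelService:
    """One LLM treated as a microservice with declared capabilities."""
    name: str
    context_window: int     # max tokens the model can ingest
    can_execute_code: bool  # supports code execution for analysis
    local_context: bool     # has access to the local repo / CLI

@dataclass
class Task:
    description: str
    estimated_tokens: int
    needs_code_execution: bool = False
    needs_local_context: bool = False

def route(task: Task, services: list[ModelService]) -> ModelService:
    """Return the first service satisfying every hard requirement."""
    for svc in services:
        if task.estimated_tokens > svc.context_window:
            continue
        if task.needs_code_execution and not svc.can_execute_code:
            continue
        if task.needs_local_context and not svc.local_context:
            continue
        return svc
    raise ValueError(f"no service can handle task: {task.description!r}")

SERVICES = [
    # Claude Code: local repo access and CLI-native operations.
    # (The 200k window here is an illustrative assumption.)
    ModelService("claude-code", context_window=200_000,
                 can_execute_code=False, local_context=True),
    # Gemini: 1M-token context window plus code execution,
    # suited to long-trace analysis and performance modeling.
    ModelService("gemini", context_window=1_000_000,
                 can_execute_code=True, local_context=False),
]
```

For example, a 600k-token trace-analysis task that needs code execution falls through to Gemini, while a small local-edit task stays with Claude Code.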