Rubber Duck
Bridges multiple OpenAI-compatible LLMs, providing an interface for querying various AI perspectives with tool access.
About
Embrace the debugging spirit with Rubber Duck, an innovative MCP server designed to connect you with a council of AI 'ducks.' Just as you'd explain a problem to a rubber duck, this tool allows you to pose questions and receive insights from multiple OpenAI-compatible LLMs simultaneously. Get diverse perspectives, leverage specialized models, and even maintain conversational context across different AI providers, making complex problem-solving and ideation more collaborative and insightful.
Key Features
- Query multiple OpenAI-compatible LLM providers simultaneously
- Maintain conversation context across multiple messages and providers
- Receive responses from all configured LLMs at once (Duck Council; see the sketch after this list)
- Cache API responses to avoid duplicate calls
- Monitor LLM provider health with automatic failover
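To make the fan-out idea concrete, here is a minimal, illustrative sketch of querying several OpenAI-compatible endpoints at once and collecting their answers, in the spirit of a Duck Council. It is not Rubber Duck's actual implementation or configuration format; the provider names, URLs, model IDs, and the `ask_duck`/`duck_council` helpers are placeholders, and any OpenAI-compatible endpoint would fit the same shape.

```python
# Sketch: fan out one prompt to several OpenAI-compatible providers
# concurrently and collect every "duck's" reply.
# All provider entries below are hypothetical placeholders.
import asyncio
from openai import AsyncOpenAI

PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o-mini", "api_key": "sk-..."},
    "local":  {"base_url": "http://localhost:11434/v1", "model": "llama3", "api_key": "unused"},
}

async def ask_duck(name: str, cfg: dict, messages: list[dict]) -> tuple[str, str]:
    """Send the same conversation to one provider and return its reply."""
    client = AsyncOpenAI(api_key=cfg["api_key"], base_url=cfg["base_url"])
    resp = await client.chat.completions.create(model=cfg["model"], messages=messages)
    return name, resp.choices[0].message.content

async def duck_council(prompt: str) -> dict[str, str]:
    """Query every configured provider concurrently and map duck -> answer."""
    messages = [{"role": "user", "content": prompt}]
    results = await asyncio.gather(
        *(ask_duck(name, cfg, messages) for name, cfg in PROVIDERS.items())
    )
    return dict(results)

if __name__ == "__main__":
    answers = asyncio.run(duck_council("Why does my binary search loop forever?"))
    for duck, answer in answers.items():
        print(f"--- {duck} ---\n{answer}\n")
```

Because every provider speaks the same OpenAI-style chat API, the same message list can be reused verbatim across them, which is also what makes cross-provider conversation context and response caching straightforward.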
Casos de Uso
- Maintain a debugging conversation with an AI, switching providers as needed (see the sketch following this list)
- Convene a 'Duck Council' to gather diverse architectural or design recommendations
- Compare explanations or solutions to a problem from various AI models
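The provider-switching use case can be sketched the same way: keep a single message history and send it to whichever duck you want next. Again, this is an assumption-laden illustration rather than Rubber Duck's API; the endpoints, model names, and prompts are placeholders.

```python
# Sketch: one debugging conversation whose context survives a switch
# between two OpenAI-compatible providers (placeholder endpoints/models).
from openai import OpenAI

history = [{"role": "user", "content": "My quicksort hits a RecursionError on sorted input. Why?"}]

# First duck: a hosted provider.
hosted = OpenAI(api_key="sk-...", base_url="https://api.openai.com/v1")
reply = hosted.chat.completions.create(model="gpt-4o-mini", messages=history)
history.append({"role": "assistant", "content": reply.choices[0].message.content})

# Follow-up goes to a different, local duck, carrying the same history
# so the conversation context is preserved across providers.
history.append({"role": "user", "content": "Show the fix using a median-of-three pivot."})
local = OpenAI(api_key="unused", base_url="http://localhost:11434/v1")
reply = local.chat.completions.create(model="llama3", messages=history)
print(reply.choices[0].message.content)
```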