Cross-LLM
by JamesANZ
Provides a unified Model Context Protocol server to access and combine responses from multiple Large Language Model APIs, including ChatGPT, Claude, and DeepSeek.
About
Cross-LLM is a Model Context Protocol (MCP) server designed to streamline interaction with various Large Language Models. It acts as a central hub, allowing users to call individual LLMs like ChatGPT, Claude, and DeepSeek, or simultaneously query all integrated models for combined responses. This facilitates diverse use cases such as multi-perspective analysis, model comparison, and enhanced reliability by leveraging multiple AI providers through a single, standardized interface.
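The sketch below shows how an MCP client might launch and connect to Cross-LLM over stdio using the official TypeScript SDK, then list the tools the server exposes. The entry-point path and environment variable names are assumptions for illustration only; the actual values depend on how the server is built and configured.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the Cross-LLM server as a child process over stdio.
  // The entry-point path and API-key variable names are assumptions;
  // check the repository's README for the actual values.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["build/index.js"],
    env: {
      OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
      ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "",
      DEEPSEEK_API_KEY: process.env.DEEPSEEK_API_KEY ?? "",
    },
  });

  const client = new Client({ name: "cross-llm-example", version: "1.0.0" });
  await client.connect(transport);

  // Discover the available tools (per-provider tools plus a combined query tool).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```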
Key Features
- Dynamic selection of LLM provider via a single tool
- Simultaneous querying of all integrated LLMs for combined responses
- Configurable model, temperature, and token limits for each LLM call (see the sketch after this list)
- Detailed output including model information and token usage statistics
- Dedicated tools for individual LLM interaction (ChatGPT, Claude, DeepSeek)
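As a rough illustration of the per-call configuration mentioned above, the following sketch continues from the connected client and invokes one of the per-provider tools. The tool name and argument keys are hypothetical guesses, not the server's confirmed schema; inspect the output of client.listTools() for the real tool definitions.

```typescript
// Continuing from the connected `client` above. "call-chatgpt" and the
// argument keys below are illustrative assumptions.
const result = await client.callTool({
  name: "call-chatgpt",
  arguments: {
    prompt: "Summarize the trade-offs of microservices in three bullet points.",
    model: "gpt-4o",   // hypothetical per-call model override
    temperature: 0.7,  // sampling temperature
    maxTokens: 512,    // response token limit
  },
});

// Responses arrive as MCP content blocks; per the description, the server
// includes model information and token-usage statistics in its output.
for (const block of result.content as Array<{ type: string; text?: string }>) {
  if (block.type === "text") console.log(block.text);
}
```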
Use Cases
- Compare the strengths and weaknesses of various LLMs for specific tasks or outputs.
- Enhance reliability and introduce redundancy by leveraging multiple LLM providers.
- Perform multi-perspective analysis by comparing responses from different LLMs on the same prompt.