Vox is a versatile multi-model AI gateway for Model Context Protocol (MCP) clients, freeing them from the constraint of a single hosted model. It grants access to a wide array of AI providers, including Google Gemini, OpenAI, Anthropic, xAI, DeepSeek, Moonshot, OpenRouter, and custom local backends such as Ollama. The design prioritizes minimalism: prompts and responses are passed through unmodified, with no system prompt injection or behavioral directives. Vox's core value lies in its intelligent routing and robust conversation memory, which let multi-turn exchanges persist and even transition across different AI providers.
Key Features
1. Supports file and image context for prompts
2. Maintains conversation memory across different AI providers
3. Exports conversation threads to JSON or Markdown
4. Connects to 8 major AI providers plus custom local models
5. Pure passthrough for prompts and responses, no system prompt injection
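The cross-provider conversation memory described above can be sketched as follows. This is a hypothetical illustration, not Vox's actual API: the class and method names are invented here. The key idea is that the thread stores a provider-neutral role/content history, so any turn can be replayed to a different backend, and the same history can be exported to JSON or Markdown.

```python
import json

class ConversationThread:
    """Illustrative sketch of provider-agnostic conversation memory."""

    def __init__(self, thread_id):
        self.thread_id = thread_id
        self.messages = []  # provider-neutral [{"role", "content"}] history

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})

    def context_for(self, provider):
        # The full history is handed to whichever provider takes the
        # next turn -- memory is not tied to any one backend.
        return {"provider": provider, "messages": list(self.messages)}

    def export_json(self):
        return json.dumps(
            {"id": self.thread_id, "messages": self.messages}, indent=2
        )

    def export_markdown(self):
        lines = [f"# Thread {self.thread_id}"]
        for m in self.messages:
            lines.append(f"**{m['role']}**: {m['content']}")
        return "\n\n".join(lines)

thread = ConversationThread("demo")
thread.add("user", "Summarize this repo.")
thread.add("assistant", "It is an MCP gateway.")
# Same history, different backends:
assert thread.context_for("gemini")["messages"] == \
       thread.context_for("openai")["messages"]
```

Because the history is plain data rather than a provider-specific session object, a conversation started on one provider can continue on another without loss of context.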