Vercel AI
Created by DarianNgo
Integrates Vercel AI SDK with an MCP server to support OpenAI and Mistral LLM providers.
About
This tool provides an MCP server implementation designed to work with the Vercel AI SDK, enabling seamless integration with both OpenAI and Mistral large language models. It supports structured output and system prompts for OpenAI, and system prompts and safe prompts for Mistral, allowing developers to leverage these models within their MCP-based applications. The server is built with TypeScript and requires Node.js 18 or higher, along with the appropriate API keys for OpenAI and Mistral.
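The exact tool names the server exposes are not listed here, but the underlying calls go through the Vercel AI SDK. As a rough sketch (assuming the standard `ai`, `@ai-sdk/openai`, and `@ai-sdk/mistral` packages, which read `OPENAI_API_KEY` and `MISTRAL_API_KEY` from the environment by default), a request with a system prompt, and Mistral's safe-prompt option, might look like this:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { mistral } from "@ai-sdk/mistral";

// OpenAI: system prompt plus user prompt, using one of the supported models.
const openaiResult = await generateText({
  model: openai("gpt-4-turbo"),
  system: "You are a concise technical assistant.",
  prompt: "Summarize what an MCP server does in two sentences.",
});
console.log(openaiResult.text);

// Mistral: same call shape, with the provider's safePrompt guardrail enabled.
const mistralResult = await generateText({
  model: mistral("mistral-large-latest", { safePrompt: true }),
  system: "You are a concise technical assistant.",
  prompt: "Summarize what an MCP server does in two sentences.",
});
console.log(mistralResult.text);
```

The prompts and model choices above are illustrative; any of the models listed under Key Features can be substituted.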
Key Features
- Supports OpenAI's gpt-4-turbo, gpt-4, and gpt-3.5-turbo models.
- Supports Mistral's mistral-large-latest, mistral-small-latest, and pixtral-large-latest models.
- Offers structured output for OpenAI (see the sketch after this list).
- Enables system prompts for both OpenAI and Mistral.
- Provides safe prompts for Mistral.
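For the structured-output feature, a plausible sketch using the Vercel AI SDK's `generateObject` with a zod schema (the schema and prompt below are hypothetical examples, not part of this server's API) would be:

```typescript
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// Ask the model for data that conforms to a zod schema instead of free-form text.
const { object } = await generateObject({
  model: openai("gpt-4-turbo"),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: "Propose a title and three tags for a blog post about MCP servers.",
});

console.log(object.title, object.tags);
```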
Use Cases
- Integrating OpenAI models into MCP applications.
- Integrating Mistral models into MCP applications.
- Building AI-powered features with Vercel AI SDK and MCP.