Unifies access to multiple large language models and providers through a single, token-efficient Model Context Protocol server, supporting composable workflows and flexible routing.
Contextive is a powerful, TypeScript-based Model Context Protocol (MCP) server designed to streamline and unify your interactions with various large language models. It provides a lean and token-efficient architecture for tool execution, allowing developers to create concise, shared prompts and define composable workflows. With built-in multi-model routing via the Vercel AI SDK, Contextive enables seamless integration with providers like Claude, ChatGPT, and Cursor, making it an ideal solution for building sophisticated AI agents and applications.
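The "composable workflows" idea above can be sketched in plain TypeScript. This is an illustrative sketch only: the `compose` helper and the stand-in tools (`summarize`, `translate`) are assumptions for demonstration, not part of Contextive's actual API.

```typescript
// A tool is an async function from input to output.
type Tool<I, O> = (input: I) => Promise<O>;

// Chain two tools so the output of the first feeds the second,
// yielding a higher-level tool — the essence of a composable workflow.
function compose<A, B, C>(first: Tool<A, B>, second: Tool<B, C>): Tool<A, C> {
  return async (input) => second(await first(input));
}

// Two minimal stand-in tools (hypothetical, for illustration only).
const summarize: Tool<string, string> = async (text) =>
  text.split(". ")[0]; // keep only the first sentence
const translate: Tool<string, string> = async (text) =>
  `[ko] ${text}`; // placeholder "translation"

// A higher-level workflow built by chaining the two tools.
const summarizeThenTranslate = compose(summarize, translate);

summarizeThenTranslate("MCP unifies tool access. It is token-efficient.")
  .then((out) => console.log(out)); // → "[ko] MCP unifies tool access"
```

In a real Contextive deployment the chained steps would be registered MCP tools rather than local functions, but the composition pattern is the same.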
Key Features
1. Spec-driven design with ADRs and tests ensuring architectural integrity
2. Lean, token-efficient tools and prompts for reduced overhead
3. Multi-provider model routing via Vercel AI SDK for unified access
4. Config-first approach with Zod and JSON Schema validation
5. Composable workflows to chain tools and models into higher-level functions
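The multi-provider routing feature can be illustrated with a minimal sketch. Contextive delegates actual routing to the Vercel AI SDK; the `routeModel` helper and the `"provider/model"` naming scheme below are illustrative assumptions, not the server's real interface.

```typescript
type Provider = "anthropic" | "openai" | "google";

interface Route {
  provider: Provider;
  modelId: string;
}

// Resolve a "provider/model" string into a routing decision,
// mirroring the unified-access idea from the feature list.
function routeModel(spec: string): Route {
  const [provider, ...rest] = spec.split("/");
  const modelId = rest.join("/");
  if (!modelId) throw new Error(`expected "provider/model", got "${spec}"`);
  switch (provider) {
    case "anthropic":
    case "openai":
    case "google":
      return { provider, modelId };
    default:
      throw new Error(`unknown provider: ${provider}`);
  }
}

console.log(routeModel("anthropic/claude-3-5-sonnet"));
// → { provider: "anthropic", modelId: "claude-3-5-sonnet" }
```

The same single entry point serves every model, so client code never branches on which vendor ultimately handles a request.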
Use Cases
1. Integrating various large language models (LLMs) into MCP clients like ChatGPT, Claude Desktop, and Cursor
2. Developing custom AI agents that require access to token-efficient tools and composable workflows
3. Standardizing access to multiple LLM providers through a single, unified interface for development