AI Tooling Server
by tevinric
Implements a streamable HTTP server that enables intelligent tool usage via Azure OpenAI GPT-4o with the Model Context Protocol.
Overview
This project is a streamable HTTP server built with FastAPI that implements the Model Context Protocol (MCP). It integrates with Azure OpenAI's GPT-4o model to provide tool-calling capabilities, ships with simple built-in tools (Calculator, Weather, Time), and offers an extensible architecture for adding custom tools. Docker support simplifies deployment, and Server-Sent Events (SSE) enable real-time streaming, making it a suitable backend for AI applications that need access to external functionality.
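For orientation, here is a minimal sketch of how a streamable SSE endpoint can be exposed with FastAPI. The route name, event payloads, and the `generate_events` helper are illustrative assumptions, not this project's actual API.

```python
# Minimal sketch of a streamable FastAPI endpoint using Server-Sent Events (SSE).
# The route and payload shapes are assumptions for illustration only.
import asyncio
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


async def generate_events():
    # Hypothetical event stream: in the real server these chunks would come
    # from the MCP tool-calling loop backed by Azure OpenAI.
    for i in range(3):
        payload = {"step": i, "message": f"processing chunk {i}"}
        yield f"data: {json.dumps(payload)}\n\n"
        await asyncio.sleep(0.1)


@app.get("/stream")
async def stream():
    # The text/event-stream media type tells clients to treat the response as SSE.
    return StreamingResponse(generate_events(), media_type="text/event-stream")
```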
Key Features
- Streamable HTTP server with Server-Sent Events (SSE) support
- Integration with Azure OpenAI GPT-4o for intelligent tool calling
- Includes built-in tools (Calculator, Weather, Time)
- Containerized deployment using Docker and Docker Compose
- Extensible architecture for adding new custom tools (see the sketch after this list)
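The following is a rough sketch of how a built-in tool such as Calculator could be declared for OpenAI-style tool calling and dispatched by name. The schema, handler, and registry names are assumptions about the extensible tool architecture, not the project's real code.

```python
# Illustrative tool definition and dispatch table; names and schema are assumed.
import json

CALCULATOR_TOOL = {
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Evaluate a basic arithmetic expression.",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {"type": "string", "description": "e.g. '2 + 2 * 3'"},
            },
            "required": ["expression"],
        },
    },
}


def run_calculator(arguments: str) -> str:
    # Hypothetical handler: parse the model-provided JSON arguments and
    # return the result as a string for the tool response message.
    expression = json.loads(arguments)["expression"]
    # eval is used only for brevity in this sketch; a real tool should use a
    # safe arithmetic parser instead.
    return str(eval(expression))


# A registry like this is one way the server could map tool names to handlers.
TOOL_HANDLERS = {"calculator": run_calculator}
```

Adding a new tool would then amount to defining its JSON schema, writing a handler, and registering both.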
Use Cases
- Providing a backend for AI agents or applications that require external tool access
- Automating complex workflows by chaining multiple tool calls orchestrated by an AI model (a loop like the one sketched below)
- Facilitating real-time interaction between AI models and external services (e.g., calculations, data retrieval)
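As a sketch of the chaining pattern above, the loop below keeps calling Azure OpenAI GPT-4o, executing any requested tools, and feeding the results back until the model returns a final answer. The environment variables, deployment name, API version, and the `tool_schemas`/`tool_handlers` parameters are assumptions, not the project's actual interface.

```python
# Rough sketch of a tool-calling loop that lets GPT-4o chain several tools
# before answering. Configuration values here are illustrative assumptions.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # illustrative API version
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)


def answer(question: str, tool_schemas: list, tool_handlers: dict) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        response = client.chat.completions.create(
            model="gpt-4o",  # Azure deployment name (assumed)
            messages=messages,
            tools=tool_schemas,
        )
        message = response.choices[0].message
        if not message.tool_calls:
            # No more tool requests: the model has produced its final answer.
            return message.content
        messages.append(message)
        for call in message.tool_calls:
            # Execute each requested tool and return its result to the model.
            result = tool_handlers[call.function.name](call.function.arguments)
            messages.append(
                {"role": "tool", "tool_call_id": call.id, "content": result}
            )
```

For example, `answer("What is 17 * 23?", [CALCULATOR_TOOL], TOOL_HANDLERS)` would let the model invoke the calculator sketched earlier and then phrase the result.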