Implements a streamable HTTP server that enables intelligent tool usage via Azure OpenAI GPT-4o with the Model Context Protocol.
This project provides a streamable HTTP server built with FastAPI that implements the Model Context Protocol (MCP). It integrates with Azure OpenAI's GPT-4o model to enable tool calling. The server ships with simple, ready-to-use tools (Calculator, Weather, and Time) and an extensible architecture for adding custom tools. With Docker support for straightforward deployment and Server-Sent Events (SSE) for real-time streaming, it serves as a backend for AI applications that need access to external functionality.
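The extensible tool architecture can be sketched as a registry that maps tool names to handlers, with a dispatch function routing the model's tool calls. This is a minimal illustration only; the names `register_tool` and `dispatch` are hypothetical, not the project's actual API.

```python
# Hypothetical sketch of an extensible tool registry; names are illustrative,
# not the project's actual API.
import datetime
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., str]] = {}

def register_tool(name: str):
    """Decorator that adds a handler function to the tool registry."""
    def wrap(fn: Callable[..., str]) -> Callable[..., str]:
        TOOLS[name] = fn
        return fn
    return wrap

@register_tool("calculator")
def calculator(expression: str) -> str:
    # Evaluate a simple arithmetic expression; a real server would sandbox this.
    allowed = set("0123456789+-*/(). ")
    if not set(expression) <= allowed:
        raise ValueError("unsupported characters in expression")
    return str(eval(expression))  # restricted charset limits what eval can run

@register_tool("time")
def current_time() -> str:
    # Return the current UTC time as an ISO 8601 string.
    return datetime.datetime.now(datetime.timezone.utc).isoformat()

def dispatch(name: str, **arguments) -> str:
    """Route a model-issued tool call to the registered handler."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)

print(dispatch("calculator", expression="2 * (3 + 4)"))  # → 14
```

Adding a custom tool then amounts to defining one more decorated function; the dispatcher picks it up without changes to the server loop.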