About
This project provides practical demonstrations of building and consuming Model Context Protocol (MCP) servers with Python's FastMCP library. It showcases two primary transport mechanisms: a Weather MCP Server exposed over HTTP and consumed through LangChain for dynamic tool discovery by LLMs, and a Calculator MCP Server exposed over STDIO for MCP-native hosts. The repository also includes a LangChain client that connects to the HTTP server and converts its MCP tools into LangChain StructuredTool objects for use with OpenAI-compatible models.
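
As a rough sketch of the server side (not the repository's actual code), a FastMCP tool server selects its transport at startup. The server name, the `get_forecast` tool, and the use of the MCP SDK's `FastMCP` class below are illustrative assumptions:

```python
# Hypothetical sketch of a FastMCP server, not the repository's actual code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")  # illustrative server name

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short (stubbed) forecast for the given city."""
    return f"Sunny skies expected in {city}."

if __name__ == "__main__":
    # Streamable HTTP for LangChain/LLM clients; a Calculator-style server
    # aimed at MCP-native hosts would instead use transport="stdio".
    mcp.run(transport="streamable-http")
```

On the client side, here is a rough sketch of one way MCP tool metadata can be wrapped as LangChain `StructuredTool` objects. The URL, session handling, and schema passing are assumptions rather than the repository's exact approach:

```python
# Hypothetical sketch of an MCP -> LangChain bridge, not the repository's client.
import asyncio

from langchain_core.tools import StructuredTool
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # The MCP session must stay open for as long as the wrapped tools are used.
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listing = await session.list_tools()

            tools = []
            for t in listing.tools:
                async def call_tool(_name=t.name, **kwargs):
                    # Forward the LLM's arguments to the remote MCP tool.
                    result = await session.call_tool(_name, kwargs)
                    return result.content

                tools.append(
                    StructuredTool.from_function(
                        coroutine=call_tool,
                        name=t.name,
                        description=t.description or "",
                        # Recent langchain-core versions accept the MCP tool's
                        # JSON schema directly as the args schema.
                        args_schema=t.inputSchema,
                    )
                )

            # `tools` can now be bound to an OpenAI-compatible chat model,
            # e.g. via bind_tools or a LangChain agent.
            print([tool.name for tool in tools])

if __name__ == "__main__":
    asyncio.run(main())
```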