Provides custom Model Context Protocol (MCP) server and client implementations integrated with a Large Language Model, making it easy to understand and experiment with MCP.
This project serves as a quick, practical starting point for exploring the Model Context Protocol (MCP). It features a custom MCP server that exposes example tools such as `get_weather`, `get_gold_price`, and `get_bitcoin_price`. An accompanying MCP client connects to this server, retrieves the tool definitions, and passes them to an integrated Large Language Model (LLaMA 4, accessed via the Groq API). This setup lets the LLM dynamically invoke the tools based on user prompts, demonstrating an end-to-end tool-calling architecture built on JSON-based schemas.
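The sketch below illustrates the server side of this flow: a minimal MCP server exposing tools with JSON-schema-compatible signatures. It assumes the official `mcp` Python SDK (FastMCP); the tool names mirror those mentioned above, but the return values are placeholders rather than calls to real data sources.

```python
# Minimal sketch of an MCP server exposing example tools.
# Assumes the official `mcp` Python SDK (FastMCP); returned values are
# placeholders for illustration, not real weather or price data.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-tools")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a short weather description for the given city."""
    # Placeholder: a real implementation would query a weather API here.
    return f"Sunny, 24 degrees C in {city}"

@mcp.tool()
def get_gold_price() -> str:
    """Return the current gold price."""
    # Placeholder value for illustration only.
    return "Gold: 2300 USD per ounce"

@mcp.tool()
def get_bitcoin_price() -> str:
    """Return the current Bitcoin price."""
    # Placeholder value for illustration only.
    return "Bitcoin: 65000 USD"

if __name__ == "__main__":
    # Serve over stdio so an MCP client can spawn this process and
    # discover the tool definitions to forward to the LLM.
    mcp.run(transport="stdio")
```

On the client side, the retrieved tool definitions are translated into the LLM's tool-calling schema so that model responses requesting a tool can be routed back to the MCP server for execution.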