Quickstart
Helps you understand and experiment with the Model Context Protocol (MCP) by providing custom server and client implementations integrated with a Large Language Model.
About
This project serves as a quick, practical starting point for exploring the Model Context Protocol (MCP). It features a custom MCP server that exposes example tools such as `get_weather`, `get_gold_price`, and `get_bitcoin_price`. An accompanying MCP client connects to this server, retrieves the tool definitions, and passes them to an integrated Large Language Model (LLaMA 4 via the Groq API). This setup lets the LLM invoke the tools dynamically based on user prompts, demonstrating an end-to-end tool-calling architecture built on JSON-based schemas.
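A minimal sketch of the server side is shown below, assuming the official `mcp` Python SDK (FastMCP). The tool bodies are stubbed placeholders for illustration, not the project's actual implementations.

```python
# server.py -- illustrative sketch of an MCP server exposing example tools.
# Assumes the official Python SDK (`pip install mcp`); tool bodies are stubs.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("quickstart-tools")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a short weather summary for the given city."""
    # Stub: a real implementation would call a weather API here.
    return f"Sunny, 24°C in {city}"

@mcp.tool()
def get_gold_price() -> str:
    """Return the current gold price."""
    # Stub: replace with a call to a market-data API.
    return "Gold: 2,300 USD/oz"

@mcp.tool()
def get_bitcoin_price() -> str:
    """Return the current Bitcoin price."""
    # Stub: replace with a call to an exchange API.
    return "BTC: 65,000 USD"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```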
Key Features
- Demonstrates end-to-end MCP protocol implementation
- Dynamic tool selection and execution by LLM
- Integration with Large Language Models (LLaMA 4 via Groq API)
- Custom MCP server implementation exposing example tools
- Custom MCP client that connects to the server and retrieves tool definitions (see the client sketch after this list)
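The client-side flow described above might look roughly like the sketch below, assuming the official `mcp` Python SDK and the `groq` package. The model ID, prompt, and file names are illustrative placeholders rather than the project's actual code.

```python
# client.py -- sketch of the tool-call loop: list MCP tools, let the LLM pick
# one, then execute the call against the MCP server. Assumed APIs: `mcp` SDK
# stdio client and Groq's OpenAI-compatible chat completions endpoint.
import asyncio
import json

from groq import Groq
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

MODEL = "meta-llama/llama-4-scout-17b-16e-instruct"  # assumed Groq model ID

async def main(prompt: str) -> None:
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Retrieve tool definitions from the MCP server and convert
            #    them to the JSON-schema tool format the Groq chat API expects.
            listed = await session.list_tools()
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in listed.tools
            ]

            # 2. Let the LLM decide which tool (if any) to invoke.
            llm = Groq()  # reads GROQ_API_KEY from the environment
            response = llm.chat.completions.create(
                model=MODEL,
                messages=[{"role": "user", "content": prompt}],
                tools=tools,
                tool_choice="auto",
            )
            message = response.choices[0].message

            # 3. Execute any requested tool call against the MCP server.
            if message.tool_calls:
                call = message.tool_calls[0]
                args = json.loads(call.function.arguments or "{}")
                result = await session.call_tool(call.function.name, args)
                print(call.function.name, "->", result.content)
            else:
                print(message.content)

if __name__ == "__main__":
    asyncio.run(main("What's the weather in Hanoi?"))
```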
Use Cases
- Integrating external tools and APIs with Large Language Models
- Developing custom MCP servers and clients
- Learning and experimenting with the Model Context Protocol (MCP)