Demo Local LLM
by HokageM
Demonstrates creating MCP clients and servers in Python and TypeScript, integrating with local LLMs.
About
This demo project shows how to build MCP clients and servers in both Python (with FastAPI) and TypeScript, integrated with a local Large Language Model (LLM). The workflow is: the client fetches the list of available tools from the server, the LLM selects a tool based on the user's prompt, and the client then executes the chosen tool on the server. Any combination of Python and TypeScript client and server can be mixed and matched.
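As a rough illustration of the server side of this workflow, the sketch below exposes a tool-listing endpoint and a tool-call endpoint with FastAPI. The endpoint paths, the example `add` tool, and the schemas are assumptions made for this sketch, not the project's actual API.

```python
# Hypothetical sketch of a minimal Python/FastAPI tool server.
# Endpoint paths and the example "add" tool are assumptions, not the project's API.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Tool registry the client can fetch and hand to the LLM.
TOOLS = {
    "add": {
        "description": "Add two integers.",
        "parameters": {"a": "integer", "b": "integer"},
    }
}


class ToolCall(BaseModel):
    name: str
    arguments: dict


@app.get("/tools")
def list_tools():
    """Return the tools the client can offer to the LLM."""
    return TOOLS


@app.post("/call")
def call_tool(call: ToolCall):
    """Execute the tool the LLM selected and return its result."""
    if call.name == "add":
        return {"result": int(call.arguments["a"]) + int(call.arguments["b"])}
    return {"error": f"unknown tool: {call.name}"}
```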
Key Features
- Multi-language MCP client implementations (Python & TypeScript).
- Local LLM integration via Ollama for tool selection (see the client sketch after this list).
- Multi-language MCP server implementations (Python/FastAPI & TypeScript).
- Demonstrates LLM-driven tool selection and execution workflow.
- Ensures the LLM returns correctly typed arguments for tool calls.
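The sketch below illustrates the client side of the tool-selection step, assuming a local Ollama instance with a tool-capable model and the hypothetical server endpoints from the earlier sketch. The model name, URLs, and tool schema are assumptions for illustration, not taken from the project.

```python
# Hypothetical client-side sketch: ask a local Ollama model to pick a tool
# and return typed arguments, then forward the call to the tool server.
# Model name, server URLs, and tool schema are assumptions for illustration.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint
SERVER = "http://localhost:8000"                # assumed FastAPI tool server

tools = [{
    "type": "function",
    "function": {
        "name": "add",
        "description": "Add two integers.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},
                "b": {"type": "integer"},
            },
            "required": ["a", "b"],
        },
    },
}]

# Ask the local model to choose a tool for the prompt.
resp = requests.post(OLLAMA_URL, json={
    "model": "llama3.1",  # any local model with tool-calling support
    "messages": [{"role": "user", "content": "What is 19 plus 23?"}],
    "tools": tools,
    "stream": False,
}, timeout=60)
message = resp.json()["message"]

# If the model chose a tool, execute it on the server with the typed arguments.
for call in message.get("tool_calls", []):
    result = requests.post(
        f"{SERVER}/call",
        json={
            "name": call["function"]["name"],
            "arguments": call["function"]["arguments"],
        },
        timeout=30,
    )
    print(result.json())
```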
Use Cases
- Showcasing cross-language communication in microservices-like environments.
- Exploring local LLM integration for function calling and tool execution.
- Learning and prototyping MCP client-server architectures.