Demonstrates integrating large language model agents with external tools and APIs using the Model Context Protocol.
This prototype implementation of Model Context Protocol (MCP) servers and adapters showcases how large language model (LLM) agents can integrate with external tools and APIs. The project features LangChain MCP Adapters with several example servers (math, weather, PII), alongside a dedicated Python-based shell server, demonstrating how MCP extends LLM functionality while addressing security concerns such as PII handling.
Key Features
- Docker support for easy deployment
- Example servers (math, weather, PII)
- LangChain integration
- Multiple server transports (stdio, SSE)
- Secure PII handling with session tracking
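To make the stdio transport concrete, here is a minimal, self-contained sketch of how an MCP-style server might dispatch a JSON-RPC `tools/call` request to a registered math tool. This is a simplified illustration of the message flow, not the project's actual server code; the tool names and the `handle_request` helper are assumptions for the example.

```python
import json

# Hypothetical tool registry, mimicking the math example server.
TOOLS = {
    "add": lambda args: args["a"] + args["b"],
    "multiply": lambda args: args["a"] * args["b"],
}

def handle_request(raw: str) -> str:
    """Handle one JSON-RPC request line and return the response line."""
    req = json.loads(raw)
    if req.get("method") != "tools/call":
        body = {"error": {"code": -32601, "message": "method not found"}}
    else:
        params = req["params"]
        tool = TOOLS[params["name"]]
        body = {"result": {"content": tool(params["arguments"])}}
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), **body})

# A client asking the server to call the "add" tool with a=2, b=3.
request = json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
})
print(handle_request(request))
```

A real MCP stdio server reads such lines from stdin and writes responses to stdout, so the LLM agent's host process can spawn it as a subprocess.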
Use Cases
- Building custom MCP servers for specific functionalities like math or PII processing
- Demonstrating Model Context Protocol (MCP) integration with LLM agents
- Integrating LangChain with external tools and APIs via MCP
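The multi-server use case above can be sketched as a connection configuration: a sketch of how several MCP servers might be declared for a LangChain-based client. The dict shape mirrors the per-server connection specs used by LangChain MCP Adapters, but the server names, script path, and URL below are illustrative assumptions, and the `validate` helper is added only for this example.

```python
# Hypothetical multi-server configuration: a local math server spawned
# over stdio, and a weather server reached over SSE. Paths and URLs
# are placeholders, not the project's actual endpoints.
server_config = {
    "math": {
        "command": "python",
        "args": ["servers/math_server.py"],
        "transport": "stdio",
    },
    "weather": {
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    },
}

def validate(config: dict) -> list:
    """Return the names of servers whose spec has a supported transport."""
    valid = []
    for name, spec in config.items():
        transport = spec.get("transport")
        if transport == "stdio" and "command" in spec:
            valid.append(name)
        elif transport == "sse" and "url" in spec:
            valid.append(name)
    return valid

print(validate(server_config))
```

Declaring servers this way lets one agent aggregate tools from several MCP servers, each with its own transport, without changing agent code when a server is added or swapped.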