Connects Large Language Models (LLMs) to data spaces via the Model Context Protocol (MCP).
This project implements a scalable architecture that lets an LLM interact with a data space through the Model Context Protocol (MCP). A secure, modular MCP server mediates all data access on behalf of the LLM, so the model never queries the database directly. The system comprises an LLM client and a FastAPI-based MCP server backed by a DuckDB data space, following MCP principles for secure, extensible data access. The design leaves room for future integration with Apache Iceberg or Trino/Presto, more capable LLMs, and a RAG architecture.
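As an illustrative sketch only (not the project's actual code), the snippet below shows how such a FastAPI-based mediation layer might expose a read-only query tool over a DuckDB data space. The endpoint path `/tools/query`, the request/response models, and the `dataspace.duckdb` file name are assumptions chosen for the example.

```python
# Sketch of the mediation idea: the server owns the DuckDB connection and
# exposes a narrow, read-only query tool, so the LLM client never touches
# the database directly. Names and paths here are illustrative assumptions.
import duckdb
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="MCP data-space server (sketch)")

# Hypothetical DuckDB file holding the data space.
DB_PATH = "dataspace.duckdb"


class QueryRequest(BaseModel):
    sql: str  # SQL the LLM client asks the server to run on its behalf


class QueryResponse(BaseModel):
    columns: list[str]
    rows: list[list]


@app.post("/tools/query", response_model=QueryResponse)
def run_query(req: QueryRequest) -> QueryResponse:
    sql = req.sql.strip().rstrip(";")
    if not sql.lower().startswith("select"):
        # Read-only guard: reject anything that is not a SELECT statement.
        raise HTTPException(status_code=400, detail="Only SELECT queries are allowed.")
    con = duckdb.connect(DB_PATH, read_only=True)
    try:
        result = con.execute(sql)
        columns = [desc[0] for desc in result.description]
        rows = [list(row) for row in result.fetchall()]
    except duckdb.Error as exc:
        raise HTTPException(status_code=400, detail=str(exc))
    finally:
        con.close()
    return QueryResponse(columns=columns, rows=rows)
```

In this sketch the server would be started with, for example, `uvicorn server:app --reload`, and the LLM client would POST JSON such as `{"sql": "SELECT * FROM my_table LIMIT 5"}` to `/tools/query`; the actual project may expose different tools and schemas.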