Proyecto TFG

Interconnects Large Language Models (LLMs) with data spaces using the Model Context Protocol (MCP).

About

This project implements a functional, scalable architecture that lets an LLM interact with a data space through the Model Context Protocol (MCP). A secure, modular MCP server mediates all LLM data access, preventing direct database interaction. The system comprises an LLM client and a FastAPI-based MCP server connected to a DuckDB data space, following MCP principles for secure, extensible data interaction. The design emphasizes scalability toward future integration with Apache Iceberg or Trino/Presto, more advanced LLMs, and a RAG architecture.

Key Features

  • Uses TinyLlama-1.1B-Chat-v1.0 for natural language queries
  • Enriched prompts with contextual information from MCP
  • Adherence to MCP design: LLMs access data only through tools
  • Modular, extensible, and traceable architecture via logs
  • Strict separation of semantic processing (LLM) and data access (MCP)
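Prompt enrichment, as listed above, can be illustrated with a minimal sketch. The `fetch_context` helper and the chat template below are assumptions for illustration (the template follows the Zephyr-style markers TinyLlama-Chat commonly uses), not the project's actual code: in the real system the context would come from MCP tool calls.

```python
# Illustrative sketch of prompt enrichment: context obtained via an MCP tool
# is prepended to the user's question before it reaches the LLM.
def fetch_context(question: str) -> str:
    # Placeholder: the real system would call the MCP server's tools here.
    return "Table sales(region VARCHAR, amount DOUBLE): 2 rows."

def build_prompt(question: str) -> str:
    context = fetch_context(question)
    return (
        "<|system|>\nAnswer using only the provided context.\n"
        f"Context: {context}\n"
        f"<|user|>\n{question}\n<|assistant|>\n"
    )
```

This keeps the separation of concerns strict: the LLM only sees text assembled by the client, while all data retrieval happens on the MCP side.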

Use Cases

  • Building a foundation for Retrieval-Augmented Generation (RAG) systems
  • Implementing a modular and extensible data access layer for LLMs
  • Enabling LLMs to query and analyze data in a secure and controlled manner