LLMs
by druellan
Exposes project-specific `llms.txt` files as resources for AI context enhancement via the Model Context Protocol.
About
llms-mcp serves as a Model Context Protocol (MCP) server, designed to enhance AI interactions by exposing your project's `llms.txt` file as a consumable resource. It automatically detects and parses `llms.txt` in your project root, making its content, along with any referenced local files or external URLs, accessible to MCP-compatible AI clients. This allows AI models to gain relevant context directly from your project's defined resources, streamlining development and improving AI-assisted workflows.
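For reference, `llms.txt` is a plain Markdown file kept in the project root whose links point at local documentation files and external URLs. The example below is purely illustrative; the file names and URLs are placeholders, not part of this project:

```markdown
# MyProject

> A short summary of what this project does and how the docs are organized.

## Docs

- [Architecture overview](docs/architecture.md): local file, exposed as an MCP resource
- [API reference](https://example.com/api-docs): external URL, fetched on demand
```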
Key Features
- Detects and exposes `llms.txt` from project root
- Exposes `llms.txt` content via `file://` URI for direct access
- Parses `llms.txt` to extract local file references and external URLs (see the sketch after this list)
- Automatically exposes referenced local files as additional MCP resources
- Can fetch external URLs on-demand when accessed by clients
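To illustrate the parsing step referenced above, here is a minimal TypeScript sketch, not the project's actual implementation, that separates local file references from external URLs in a Markdown-style `llms.txt` (the regular expression, helper names, and entry point are assumptions):

```typescript
import { readFileSync } from "node:fs";
import { resolve } from "node:path";

// Matches Markdown links of the form [label](target)
const LINK_PATTERN = /\[([^\]]*)\]\(([^)\s]+)\)/g;

interface LlmsTxtReferences {
  localFiles: string[];   // referenced files, resolved against the project root
  externalUrls: string[]; // http(s) links, fetched on demand
}

function parseLlmsTxt(projectRoot: string): LlmsTxtReferences {
  const text = readFileSync(resolve(projectRoot, "llms.txt"), "utf8");
  const localFiles: string[] = [];
  const externalUrls: string[] = [];

  for (const match of text.matchAll(LINK_PATTERN)) {
    const target = match[2];
    if (/^https?:\/\//.test(target)) {
      externalUrls.push(target);
    } else {
      localFiles.push(resolve(projectRoot, target));
    }
  }

  return { localFiles, externalUrls };
}

// Example: list what could be exposed as additional MCP resources
const refs = parseLlmsTxt(process.cwd());
console.log("Local resources:", refs.localFiles);
console.log("External resources:", refs.externalUrls);
```

In this sketch, local paths are resolved against the project root so they can be served under `file://` URIs, which mirrors the behavior described in the feature list above.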
Use Cases
- Provide project-specific context and documentation to AI models for enhanced understanding
- Integrate with MCP-compatible AI clients such as Claude Desktop for dynamic context provisioning (see the configuration sketch below)
- Streamline AI-assisted development by making relevant project files and external resources accessible to AI
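As a rough sketch, an MCP client such as Claude Desktop is typically pointed at a server through its `mcpServers` configuration. The server name, command, and path below are placeholders; consult the project's installation instructions for the actual invocation:

```json
{
  "mcpServers": {
    "llms": {
      "command": "node",
      "args": ["/path/to/llms-mcp/dist/index.js"]
    }
  }
}
```

After the client restarts, the server's resources (the `llms.txt` content plus any referenced files) should become available alongside the client's other context sources.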