Explore our complete collection of MCP servers to connect AI with your favorite tools; a minimal connection sketch follows the listing below.
Provides AI assistants with comprehensive access to shadcn/ui v4 components, blocks, demos, and metadata via the Model Context Protocol.
Enables scalable mobile automation through a platform-agnostic interface for interacting with native iOS/Android applications and devices.
Enables interaction with an Obsidian vault via the Obsidian Local REST API plugin.
Provides a lightweight memory system for coding agents through a graph-based issue tracker.
Enhances an AI model's reasoning by having it recursively evaluate and refine its responses through self-argumentation.
Connects Supabase projects to AI assistants via the Model Context Protocol (MCP).
Converts various file types and web content into Markdown format.
Exposes MCP stdio-based servers over SSE or WebSockets, enabling remote access and integration.
Create and configure development containers from devcontainer.json files, providing isolated development environments.
Automatically generates and evaluates presentations from documents using an agentic AI system.
Intelligently routes large language model requests to the most suitable models and tools for optimized inference, enhanced security, and improved accuracy.
Provides a catalog of official Microsoft MCP server implementations for AI-powered data access and tool integration.
Build a persistent semantic graph from conversations with AI assistants using standard Markdown files.
Connects Ableton Live to Claude AI, enabling prompt-assisted music production and session manipulation.
Facilitates communication between MCP clients and servers by bridging different transport protocols like stdio and SSE.
Provides a Gradio web interface for locally running various Llama 2 models on GPU or CPU across different operating systems.
Crawls, extracts, and organizes technical documentation into an LLM-ready format, streamlining research and implementation for developers.
Integrates advanced AI capabilities directly into Zotero, enabling users to chat with PDFs, extract detailed insights, and streamline research workflows.
Automates data integration via a stable, self-healing SDK, providing schema-drift detection, retries, and remappings to maintain continuous data flow without connector maintenance or rewrites.
Transforms existing API servers and services into Model Context Protocol (MCP) compliant endpoints with zero code changes.
Streamline LLM Agent development with an all-in-one library offering tools, prompts, frameworks, and models.
Automates software development task management for AI agents, facilitating chain-of-thought reasoning and code consistency.
Enables AI assistants to search and analyze arXiv papers through a simple Model Context Protocol interface.
Empowers AI agents and coding assistants with web crawling and retrieval-augmented generation (RAG) capabilities.
Create custom chatbots with personalized knowledge bases using advanced language models.
Provides a Model Context Protocol (MCP) server for accessing and interacting with Grafana instances and their surrounding ecosystem.
Accelerate the development and research of complex Retrieval-Augmented Generation (RAG) systems with a low-code, modular framework.
Enables protocol-level interaction with a Model Context Protocol server, allowing users to send commands, query data, and work with server resources.
Provides n8n workflow templates for building various AI agents.
Exposes comprehensive browser functions via the Model Context Protocol, enabling external applications and AI models to programmatically interact with a web browser.
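To make the catalog concrete, here is a rough sketch of what "connecting AI with a tool" looks like at the protocol level: a client spawns one of these servers over stdio, performs the MCP handshake, and enumerates the tools it exposes. This is a minimal illustration, assuming the official @modelcontextprotocol/sdk TypeScript package and the @modelcontextprotocol/server-everything demo server; the client name and version strings are placeholders.

```typescript
// Minimal MCP client sketch: spawn a server over stdio, complete the
// handshake, and list the tools it exposes to an AI assistant.
// Assumes the official TypeScript SDK (@modelcontextprotocol/sdk) and the
// demo "everything" server; any stdio-based server from the catalog
// could be swapped in via the command/args below.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main(): Promise<void> {
  // Launch the server as a child process and talk to it over stdin/stdout.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-everything"],
  });

  // Client name/version are placeholders identifying this example client.
  const client = new Client(
    { name: "catalog-demo-client", version: "0.1.0" },
    { capabilities: {} }
  );

  // connect() performs the MCP initialize handshake over the transport.
  await client.connect(transport);

  // Discover what the server offers: each tool is something the AI can call.
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? "(no description)"}`);
  }

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Remotely hosted servers, such as those bridged from stdio to SSE or WebSockets by the bridge servers listed above, keep the same session API; only the transport object changes.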