Discover our curated collection of MCP servers for learning & documentation. Explore 1416 servers and find the perfect MCPs for your needs.
Enables AI assistants to interact with and extract information from a Logseq knowledge graph.
Analyzes React code and generates documentation locally using the Model Context Protocol.
Enables AI assistants to execute MATLAB code, generate scripts from natural language, and access documentation directly within conversations.
Integrates with The Movie Database (TMDB) API to provide movie information, search capabilities, and recommendations within Claude Desktop.
Connects a local Zotero library with Claude's desktop interface for direct read access.
Enables access to Quran.com's REST API for verse search, translation, and tafsir.
Provides AI coding agents with always-up-to-date, version-specific package documentation as context.
Provides a detailed, iterative walkthrough for implementing authorization in a Model Context Protocol (MCP) server.
Serves development documentation via the Model Context Protocol, designed for various development frameworks.
Augments AI responses with relevant documentation context through vector search.
Optimizes context for AI coding assistants by enabling them to extract targeted information from files and command outputs, rather than processing large files in their entirety.
Manages markdown documentation with frontmatter, optimized for AI assistant integration.
Provides AI assistants with comprehensive access to Roblox Studio projects for analysis, debugging, and understanding game architecture.
Enables AI assistants to leverage Go's Language Server Protocol for advanced Go code analysis.
Enhances the developer experience by providing specialized tools for UI5 development within agentic AI workflows.
Provides Claude with persistent memory and workspace file access across all chats to eliminate repetitive explanations.
Generates summaries of YouTube videos via the Model Context Protocol.
Enables Large Language Models to explore and understand OpenAPI specifications through specialized tools.
Enables LLMs to effectively utilize new and internal APIs by providing real-time, contextual API documentation.
Manages memory banks, enabling AI assistants to store and retrieve information across sessions for maintaining context.
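All of the servers above expose their capabilities to AI assistants through MCP's JSON-RPC interface. As a minimal sketch of the shape of that exchange, the toy handler below answers the `tools/list` and `tools/call` methods for a single hypothetical tool (the tool name, its schema, and the echo behavior are illustrative assumptions, not any real server's implementation; a real server also handles initialization, resources, prompts, and transport framing):

```python
import json

# Hypothetical in-memory tool registry for illustration only.
TOOLS = {
    "summarize_video": {
        "description": "Generate a summary of a YouTube video.",
        "inputSchema": {
            "type": "object",
            "properties": {"url": {"type": "string"}},
            "required": ["url"],
        },
    },
}

def handle_request(raw: str) -> str:
    """Dispatch a single JSON-RPC request to a simplified MCP handler."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Advertise the tools this server offers.
        result = {"tools": [{"name": n, **meta} for n, meta in TOOLS.items()]}
    elif req["method"] == "tools/call":
        # Toy implementation: echo the call instead of doing real work.
        name = req["params"]["name"]
        args = req["params"]["arguments"]
        result = {"content": [{"type": "text",
                               "text": f"called {name} with {args}"}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                           "error": {"code": -32601,
                                     "message": "Method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                       "result": result})
```

A client first calls `tools/list` to discover what the server offers, then invokes a specific tool with `tools/call`; the assistant sees only the advertised names and schemas, never the server's internals.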