Discover our curated collection of MCP servers for learning & documentation. Browse 1,820 servers to find the perfect MCP for your needs.
Enables LLMs to interact with Logseq graphs, create pages, manage blocks, and organize information programmatically.
Record and manage notes with AI models using a simple note-taking server.
Provides comprehensive access to academic paper data, author information, and citation networks via the Semantic Scholar API.
Provides AI tools with access to Rust documentation from docs.rs via the Model Context Protocol.
Transforms unreadable documentation into an intelligent, searchable knowledge base.
Provides access to Jewish texts from Sefaria.org's library for Large Language Models via the Model Context Protocol.
Searches and retrieves Go package documentation from pkg.go.dev via an MCP server.
Crawls websites, generates Markdown documentation, and makes that documentation searchable.
Demonstrates implementation of an MCP client using SwiftUI and the Anthropic API.
Provides a curated list of resources related to Model Context Protocol (MCP) servers.
Enables AI assistants to access and analyze your Cursor chat history for personalized coding assistance and insights.
Integrates the Capacities knowledge management system with AI models via the Model Context Protocol for seamless interaction.
Enables AI agents to assist with Emacs Lisp coding tasks by providing a structured API for interacting with Emacs and manipulating Elisp code.
Equips AI development and QA workflows with comprehensive documentation and up-to-date code examples for AntV visualization libraries.
Connects AI tools and Large Language Models (LLMs) to the latest Astro documentation via the Model Context Protocol (MCP).
Enables AI agents and automation platforms to directly query NotebookLM for grounded, citation-backed answers from personal knowledge bases.
Provides a token-efficient local server to give AI assistants access to DaisyUI component documentation.
Performs local semantic search on markdown files, delivering relevant documentation chunks to Claude Code for efficient, token-saving retrieval-augmented generation.
Provides a CLI-first interface for inspecting Composer-based Kirby CMS projects and interacting with a live Kirby runtime.
Provides AI with full read-write access to a Logseq knowledge graph, enabling comprehensive traversal, analysis, and content management.