Discover our curated collection of MCP servers for developer tools. Browse 14,629 servers to find the perfect MCP for your needs.
Integrates brain-computer interface technology with the Model Context Protocol for neural signal processing and AI interaction.
Enables documentation from GitHub repositories or websites to be converted into prompts suitable for Large Language Models (LLMs) through an MCP server.
Enables secure command-line execution with validation for LLM applications.
Manages tasks through the Model Context Protocol (MCP).
Facilitates testing of MCP Server implementations.
Generates a server based on the Twilio Routes OpenAPI specification for Multi-Agent Conversation Protocol interactions.
Demonstrates functionality of the GitHub MCP server through GitHub API integration and multilingual support.
Provides real-time multilingual text and audio translation using AI, supporting over 20 languages through a modular server design.
Enables Large Language Models (LLMs) to directly query live Tableau CRM Analytics data through a Model Context Protocol (MCP) interface.
Provides a Model Context Protocol server for Atlassian Confluence, enabling complete asset downloads and smart collaborative editing.
Provides a minimal MCP tool server with weather and greeting functionalities, integrated with a LangChain agent for dynamic tool discovery.
Enables AI assistants to interact with PostgreSQL databases using natural language queries.
Provides a FastAPI server to manage local files and orchestrate Databricks operations, including code submission, job management, and DLT pipeline creation.
Provides a local server for programmatic control and automation of 7-Zip archive and file system operations.
Implements a basic Model Context Protocol (MCP) server in TypeScript, following official specifications and best practices.
Facilitates AI-powered chat about personal resumes and enables email notifications, ideal for job interviews and demonstrating Model Context Protocol capabilities.
Provides AI assistants with advanced frontend debugging capabilities through specialized tools for web applications.
Provides access to files across multiple code repositories from a single unified Model Context Protocol (MCP) server, simplifying file management for AI assistants and developers.
Connects any OpenAI-compatible LLM API to the Model Context Protocol, enabling robust analysis and evaluation of large language model quality.
Streamlines management of Cloudflare Workers, KV, R2, DNS, and cache purging directly from your AI assistant or IDE.