Explore the full collection of MCP servers for connecting AI to your favorite tools.
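As a rough illustration of what the entries listed below have in common, here is a minimal sketch of an MCP server that exposes a single tool, assuming the official MCP Python SDK's FastMCP interface; the server name and the `add` tool are illustrative only and do not correspond to any entry in the list.

```python
# Minimal MCP server sketch using the official Python SDK's FastMCP helper.
# The server name and the example tool are hypothetical, not taken from any
# entry in the directory below.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-calculator")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b

if __name__ == "__main__":
    # Serve over stdio so an MCP client (for example, Claude Desktop or an
    # AI coding assistant) can launch this script as a subprocess.
    mcp.run()
```

Most of the servers below follow this same pattern: they register a set of tools, and an MCP-capable client is pointed at them through a small configuration entry that names the command to launch or the endpoint to reach.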
Automates Microsoft Office applications via AI models using a COM interface on Windows.
Enables Large Language Models to securely access, analyze, and manipulate data within Firebird SQL databases.
Enables AI assistants to manage and operate GitHub Actions workflows.
Connects AI agents and LLMs with real-time web data, providing battle-tested web scraping, JavaScript rendering, and anti-bot protection.
Converts PDF, Word, and Excel documents into Markdown format, offering a high-performance server solution with both Model Context Protocol (MCP) and RESTful API support.
Retrieves information from the Pinecone Assistant service.
Performs comprehensive domain research, including availability checks, WHOIS lookups, DNS record retrieval, and SSL certificate inspection, all without needing API keys.
Streamlines affiliate marketing by providing comprehensive tools for Taobao, JD, and Pinduoduo platforms, including link conversion, product search, and promotion management.
Provides a minimal Rails API template for building Model Context Protocol (MCP) servers with tool execution capabilities.
Dynamically routes coding tasks between local LLMs, free APIs, and paid APIs to optimize costs.
Executes Python code generated by LLMs in a secure, locally-hosted environment, leveraging Hugging Face's LocalPythonExecutor and MCP for LLM application integration.
Simplifies adding authentication and authorization to Node.js MCP servers.
Manages Linear issues, projects, and other entities via an optimized Model Context Protocol (MCP) server designed for language models.
Transforms feature ideas into production-ready code using a systematic Spec-Driven Development approach.
Empowers AI assistants with comprehensive tools to interact with Contentful APIs via the Model Context Protocol.
Enables AI tools to interact with Gradle projects programmatically via the Model Context Protocol.
Provides real-time stock data and historical analysis through the Yahoo Finance API, designed for seamless integration with MCP.
Establishes a privacy-first, local RAG server for AI coding assistants, enabling semantic search over personal documents without exposing data to external services.
Enables AI CLI tools like Cursor and Claude Code to invoke specialized, custom AI agents for specific tasks.
Streams online music directly from the command-line interface, with support for AI integration via the Model Context Protocol.
Provides an advanced MCP server for RAG-enabled memory, leveraging a knowledge graph with vector search capabilities.
Enables AI assistants to generate images, text, and audio through the Pollinations APIs using the Model Context Protocol.
Executes tasks using Model Context Protocol (MCP) tools via the Anthropic Claude, AWS Bedrock, and OpenAI APIs.
Integrates diverse AI models into Claude Desktop, empowering enhanced development and analytical capabilities.
Enables Honeycomb Enterprise customers to query and analyze their observability data, alerts, and dashboards, and to cross-reference production behavior with their codebase using LLMs.
Centralizes and manages AI interactions, conversations, and insights using a local-first, graph-augmented approach for personal context.
Analyzes Instagram engagement metrics to generate leads, extract demographic insights, and assess audience feedback.
Enables AI agents to debug live programs by bridging Model Context Protocol clients with Debug Adapter Protocol servers.
Enables AI assistants to securely interact with a local file system for reading, writing, and managing files within a designated project directory.
Simplifies the deployment of Model Context Protocol (MCP) servers by providing a purpose-built CI/CD platform.