Browse our curated collection of MCP servers for developer tools. Explore 18,204 servers and find the MCP that best fits your needs.
Enables interaction with Jira instances using natural language commands through a Model Context Protocol (MCP) server.
Integrates modern APIs, AI, and automation to extend server functionality.
Invokes task chains of the iFlytek SparkAgent Platform using MCP Server.
Connects Claude for Desktop to a locally running eGet web scraper, enabling web content scraping directly within Claude conversations.
Enhances the reliability of interactions between Large Language Models (LLMs) and tools, especially MCP servers, by providing a structured, validated, and controllable communication layer.
Manages remote terminals with ease through a simple, configurable server.
Integrates and manages Devici platform data, including users, threat models, and security insights, through a Model Context Protocol (MCP) server for LLMs.
Provides Angular project analysis and refactoring capabilities for Large Language Models.
Extends the Sliver C2 framework with a Python-based command and control server for advanced operations.
Provides a comprehensive suite of Python functions and an MCP server for interacting with Kubernetes clusters, primarily for building intelligent agents.
Bridges Language Server Protocol (LSP) with Model Context Protocol (MCP) to enhance codebase interaction for large language models.
Accelerates HuggingFace model downloads and integrates them into various AI development clients.
Empowers AI development by enabling agents to learn from past experiences, reducing repetitive trial-and-error and optimizing token usage.
Performs unified memory forensics using a multi-tier engine that combines Rust speed with Volatility3 coverage for comprehensive analysis.
Exposes local Ollama API capabilities as tools for AI agents, enabling them to use local models for chat, completion, and embeddings.
Transforms Obsidian notes into agent-ready context, enabling preview-only action plans and safe repository handoffs.
Provides a lightweight, self-hosted server to give AI coding assistants long-term, shared, and searchable memory.
Establishes a robust governance layer for AI agents, enforcing policies, managing credentials, and auditing interactions with external tools.
Provides precise, content-anchored file editing and robust access control for AI agents operating on file systems.
Orchestrates crash-proof LLM pipelines with disk-based checkpointing, cost-effective free-tier model routing, and guaranteed structured output.
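Every listing above speaks the same protocol: an MCP server advertises named tools and executes `tools/list` and `tools/call` requests over JSON-RPC. As a rough illustration of that dispatch pattern, here is a minimal standard-library sketch; the method names follow the MCP specification, but the registry, handler, and response shapes are simplified assumptions, not the official SDK.

```python
# Illustrative sketch only: a tiny JSON-RPC 2.0 tool dispatcher in the
# spirit of MCP's tools/list and tools/call methods. The real protocol is
# richer (initialization, capabilities, input schemas, transports).
import json

TOOLS = {}  # name -> callable; a stand-in for an MCP server's tool registry

def tool(fn):
    """Register a function as a callable tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Add two integers (an example tool an LLM client could call)."""
    return a + b

def handle(request_json: str) -> str:
    """Dispatch one JSON-RPC request string and return a response string."""
    req = json.loads(request_json)
    if req["method"] == "tools/list":
        result = {"tools": sorted(TOOLS)}
    elif req["method"] == "tools/call":
        params = req["params"]
        result = {"content": TOOLS[params["name"]](**params["arguments"])}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601,
                                     "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

In a real server the same loop would read requests from stdio or HTTP and each tool would carry a JSON schema describing its arguments; the servers listed here differ mainly in which tools they register behind this interface.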