Discover our curated collection of MCP servers for developer tools. Browse 20,484 servers and find the perfect MCPs for your needs.
Connects to Hugging Face Spaces to provide image generation, vision, text-to-speech, and other capabilities to Claude Desktop with minimal setup.
Enables AI assistants like Claude to search and reference Apple Notes during conversations using Model Context Protocol.
Enables LLM applications to interact with macOS through AppleScript.
Enables Large Language Models to interact with GraphQL APIs through schema introspection and query execution.
Enables AI assistants to control, debug, and analyze Android devices using natural language via the Model Context Protocol.
Enables AI agents to interact with multiple EVM-compatible networks through a unified interface.
Provides a Model Context Protocol (MCP) server for connecting to a Kubernetes cluster and interacting with its resources.
Enables agentic AI workflows for developers using Docker containers and Markdown prompts.
Enables control of Ableton Live via the Model Context Protocol (MCP) using OSC (Open Sound Control).
Enables code searching in a codebase using ast-grep CLI within an MCP environment.
Provides a local, open-source coding assistant experience similar to Claude Code, leveraging the Vercel AI SDK.
Enables tool use and function calling with Anthropic models, facilitating interaction with external resources.
Enables AI coding assistants to access Figma design data and convert it to HTML/CSS code.
Extends AI capabilities by providing a powerful interface for remote control, calculations, email operations, and knowledge search.
Enables AI models to control macOS with mouse, keyboard, screen, and window management.
Enables AI models and coding assistants to semantically search and retrieve Claude Agent Skills using vector embeddings and the Model Context Protocol.
Enhances AI agents and integrated development environments with an open-core, self-improving context compression and code search suite.
Detects Python security vulnerabilities, dead code, and code quality issues with high precision using hybrid static analysis and optional local LLM agents.
Provides a universal memory layer for AI coding agents, facilitating knowledge sharing and workspace synchronization across multiple platforms.
Automate and enhance your AI agent's capabilities with scheduled tasks, persistent memory, and background processes running locally.