Discover our curated collection of MCP servers for developer tools. Browse 12,007 servers and find the perfect MCPs for your needs.
Enables AI agents to interact with terminal applications by representing their state as a structured Terminal State Tree (TST).
Provides a robust Model Context Protocol agent for integrating web search capabilities into AI applications and research workflows.
Empowers LLMs to efficiently retrieve context from OpenAPI specifications without polluting their internal context.
Manages and processes m3u8/HLS streaming content via a desktop application and the Model Context Protocol.
Routes Model Context Protocol requests to various downstream services and integrates a RAG-enabled AI agent for development assistance.
Enables secure execution of shell commands with a dynamic approval system and comprehensive audit logging.
Facilitates seamless interaction with Google Sheets by providing a local MCP server interface.
Challenges proposed actions before execution, enforcing clarity and coherence with predefined principles through adversarial questioning and NLI-based reasoning evaluation.
Orchestrates multiple MCP services by positioning Llama Maverick as the central AI brain for intelligent management and request routing.
Demonstrates basic usage of the Model Context Protocol with a Python server and client.
Enables AI assistants to securely interact with the file system for various operations.
Provides specialized tools for validating and retrieving metadata from Markdown files crucial to the Subordinación y Valor research project.
Provides an MCP server integrating intelligent web search, local AI processing, and robust OAuth2 authentication over HTTP.
Provides a TypeScript sample implementation of an MCP server, demonstrating tool creation and AI completion request handling.
Provides intelligent analysis for code modernization, browser compatibility, and real-time MDN documentation queries for JavaScript/TypeScript projects.
Retrieves user geolocation information via EdgeOne Pages Functions and integrates it with large language models through the Model Context Protocol (MCP).
Validates AI responses using project context and Gemini, acting as a middleware bridge between Cursor IDE and AI models.
Connects LLMs to Jira Assets data, enabling natural language queries without SQL.
Efficiently manages memory for AI agents, optimizing their performance and enabling long-term information retention.
Organizes and provides searchable access to project documentation stored in local markdown files.