Discover our curated collection of MCP servers for deployment & DevOps. Browse 2,221 servers and find the perfect MCPs for your needs.
Integrates Cumulocity IoT platform data with AI agents, enabling AI-powered device management, data analysis, and operational insights.
Provides secure access to Kali Linux web penetration testing tools for AI assistants in a controlled Docker environment.
Automates comprehensive penetration testing with an AI-powered framework that leverages LM Studio for autonomous decision-making.
Integrates AI assistants with Docker infrastructure, enabling large language models to safely and efficiently perform container operations.
Establishes a secure connection between GitHub and VS Code through a Dockerized Model Context Protocol (MCP) server.
Connects to IMAP servers to integrate email functionality with MCP systems.
Establishes a foundational AI kernel to serve as the execution, reasoning, orchestration, and governance core for domestic, enterprise, and sovereign AI systems.
Secures code across the entire Software Development Life Cycle (SDLC) using a combination of static analysis, dynamic analysis, and agentic AI to deliver accurate security insights.
Provides an agent-friendly API for managing and orchestrating AFL++ fuzzing campaigns and security testing workflows.
Manages MinIO object storage services with comprehensive features for bucket, object, and permission control, designed for interaction via AI assistants and programmatic access.
Provides dynamic access to the Meraki Dashboard API, enabling comprehensive network management, security auditing, and compliance checks.
Generates interactive 3D scenes from a single 2D image in less than a second.
Provides a Model Context Protocol server to interact with Atlassian Jira, Confluence, and Bitbucket instances, supporting both Cloud and Data Center deployments.
Establishes an AI-driven naming convention that encodes complete file context into filenames, facilitating autonomous AI code generation and human-AI collaboration.
Deploys the MiniMind Large Language Model as an all-in-one Docker solution, offering a web UI, an OpenAI-compatible API, and Model Context Protocol (MCP) support.
Provides an all-in-one Docker deployment for the Fabric AI augmentation framework, offering a web UI, REST API, and MCP server.
Provides AI agents with a streamlined Nushell environment, offering comprehensive system interaction, structured data handling, and background task management through a minimal set of powerful tools.
Enables AI agents to perform in-depth analysis of .NET memory dumps, inspecting heaps, threads, and exceptions using natural language.
Orchestrates enterprise-grade multimodal AI pipelines using FastAPI and OpenAI's GPT-4o, Whisper, and TTS for advanced analytics and real-time processing.
Demonstrates a Model Context Protocol (MCP) server using `stdio` transport.
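For readers unfamiliar with the `stdio` transport, the sketch below shows what such a demo server can look like, assuming the official Python MCP SDK (`mcp` package); the server name and `echo` tool are illustrative placeholders, not taken from the listed project.

```python
# Minimal sketch of an MCP server using stdio transport (Python MCP SDK).
# The server name and tool are hypothetical examples, not the listed project's code.
from mcp.server.fastmcp import FastMCP

# Create the server; the name is arbitrary and is reported to clients on initialization.
mcp = FastMCP("demo-stdio-server")

@mcp.tool()
def echo(text: str) -> str:
    """Return the input text unchanged (placeholder tool for the demo)."""
    return text

if __name__ == "__main__":
    # stdio transport: the MCP client launches this process and exchanges
    # JSON-RPC messages over stdin/stdout.
    mcp.run(transport="stdio")
```

With this transport, a client (such as an MCP-capable IDE or assistant) spawns the script as a subprocess and communicates entirely over standard input and output, so no network port needs to be opened.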