Discover our curated collection of MCP servers for cloud infrastructure. Explore 1210 servers and find the perfect MCPs for your needs.
Enables Large Language Models (LLMs) to build full-stack solutions by providing access to Aiven services like PostgreSQL, Kafka, ClickHouse, Valkey, and OpenSearch.
Provides a universal control plane for AI agents and platform engineers to manage and interact with Model Context Protocol (MCP) servers.
Bridges AI agents and the Akash Network, enabling AI models to deploy applications and manage deployments.
Enables AI assistants to interact with and manage Sakura Cloud infrastructure via the Model Context Protocol (MCP).
Enables Large Language Models to interact with Nutanix Prism Central via the Model Context Protocol.
Enables fetching and processing of web content via Cloudflare Browser Rendering for use as context in LLMs.
Enables natural language control over cloud infrastructure through an AI-powered interface to the Dokploy platform.
Provides seamless integration with Google Cloud Storage, enabling AI assistants to perform file operations, manage buckets, and interact with GCS resources directly.
Provides an MCP server for querying OCI registries and image references.
Provides read-only access to Kubernetes clusters, designed for seamless integration with AI assistants.
Empowers AI Agents to interact with OpenSearch using a standardized and extensible interface.
Enables agentic tools to manage ArgoCD applications and resources through natural language interactions.
Enables Cursor and Windsurf to manage Supabase databases and execute SQL queries safely.
Enables AI agents to interact with Google BigQuery databases via natural language queries and schema exploration.
Manages aaPanel server instances by providing remote access to system information, website configurations, databases, Docker containers, and email accounts.
Provides AI agents with programmatic access to CodeSandbox SDK operations for managing development environments.
Extracts web content using Cloudflare Browser Rendering for use in LLM context.
Provides read-only access to your Tailscale network through the Model Context Protocol.
Provides a unified interface to interact with Firebase Authentication, Firestore, and Storage services.
Enables natural language querying of Google Earth Engine datasets, task execution, and visualization within chat interfaces.
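Every server listed above implements the same Model Context Protocol, so a client connects to any of them in the same way. The sketch below shows a minimal client that launches a stdio-based MCP server and lists its tools, assuming the official @modelcontextprotocol/sdk TypeScript package; the server command and names used here are placeholders, not a specific server from this directory.

```typescript
// Minimal MCP client: connect to a stdio-based server and list its tools.
// Package and import paths follow the official TypeScript SDK; the server
// command below is a placeholder, not a specific server from this list.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the MCP server as a child process and communicate over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "example-mcp-server"], // placeholder package name
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );

  await client.connect(transport);

  // Each MCP server advertises its capabilities the same way, whether it
  // wraps Kubernetes, BigQuery, Cloudflare, or another backend.
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? ""}`);
  }

  await client.close();
}

main().catch(console.error);
```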