Discover 169 MCPs built for AWS.
Manages and streams AI data with a specialized database for vectors, images, text, and video, supporting LLM applications and deep learning model training.
Bring AWS best practices directly into development workflows with a suite of specialized Model Context Protocol (MCP) servers.
Automates FinOps processes to reduce cloud costs and streamline financial operations through no-code workflows.
Monitors PostgreSQL databases, finds root causes of issues, and suggests fixes and improvements.
Identifies cloud risks, tests security vulnerabilities, and enhances cloud protection for public cloud tenants.
Streamline the deployment of applications and management of multi-cloud infrastructure for both AI agents and humans.
Define and compose secure Model Context Protocol (MCP) servers to generate and deploy AI workflows and agents with integrated governance.
Streamlines the deployment of Terraform applications to LocalStack for local AWS cloud development and testing.
Enables the use of the AWS Cloud Development Kit (CDK) CLI to deploy applications against local APIs provided by LocalStack.
Enables the creation of reliable agent workflows with advanced reliability features for real-world applications.
Simplifies the creation of serverless Model Context Protocol (MCP) tools using AWS Lambda and Streamable HTTP.
Manage AWS infrastructure using natural language commands through an intelligent AI-powered system.
Enables AI assistants to execute AWS CLI commands in a secure, containerized environment using the Model Context Protocol (MCP).
Connect AWS FinOps capabilities to an AI assistant for cloud cost analysis, waste auditing, and budget insights using natural language.
Explore AWS EC2 Spot instance types, savings, prices, and interruption frequencies for optimal cloud resource selection.
Provides AI assistants access to AWS CloudWatch Logs for analysis, searching, and correlation.
Transforms Docker Compose configurations into secure, scalable cloud deployments across major providers like AWS, GCP, and DigitalOcean.
Provides LLMs with the latest stable package versions for multiple package registries when coding.
Enables Large Language Models to execute AWS Lambda functions as tools via Anthropic's Model Context Protocol (MCP) without code modifications.
Provides example implementations for building agentic AI solutions with AWS, utilizing the Model Context Protocol.
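Every server in this catalog speaks the same wire format: MCP exchanges JSON-RPC 2.0 messages, and a tool invocation arrives as a `tools/call` request. The sketch below is a minimal, stdlib-only illustration of that message shape, assuming a hypothetical `echo` tool; a real server would register tools through an MCP SDK rather than a hand-rolled dispatcher.

```python
import json

# Hypothetical tool registry for illustration only -- a real MCP server
# registers tools via an SDK (e.g. the official Python `mcp` package).
TOOLS = {"echo": lambda args: args.get("text", "")}

def handle_request(raw: str) -> str:
    """Dispatch a single JSON-RPC 2.0 `tools/call` request to a tool."""
    req = json.loads(raw)
    name = req["params"]["name"]
    result = TOOLS[name](req["params"].get("arguments", {}))
    # MCP tool results wrap output in a list of typed content blocks.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req["id"],
        "result": {"content": [{"type": "text", "text": result}]},
    })

request = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "echo", "arguments": {"text": "hello"}},
})
print(handle_request(request))
```

In production these messages travel over stdio or Streamable HTTP (as in the Lambda-based servers above); only the transport differs, not the request/response shape.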