About
This project provides a complete serverless solution for building intelligent LLM agents on AWS Bedrock. MCP servers hosted on AWS Lambda supply memory and context management, letting the agent store new information or recall past memories to produce context-aware responses. The architecture uses Bedrock for language models and embeddings and S3 Vector tables for persistent semantic memory, and the entire stack is defined and deployed with the Python CDK for a scalable, efficient deployment.
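As a rough sketch of how such a stack might be wired together with the Python CDK (this is illustrative only, not the project's actual code; the construct names, handler path, and bucket are assumptions):

```python
# Minimal sketch: a Lambda-hosted MCP memory server with Bedrock access
# and an S3 bucket backing the persistent semantic memory store.
# "McpMemoryStack", "memory_handler", and the asset path are hypothetical.
from aws_cdk import Stack, Duration
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_iam as iam
from aws_cdk import aws_s3 as s3
from constructs import Construct


class McpMemoryStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Bucket assumed to hold the vector data for semantic memory.
        memory_bucket = s3.Bucket(self, "MemoryVectorBucket")

        # Lambda function hosting the MCP memory server.
        mcp_server = _lambda.Function(
            self,
            "McpMemoryServer",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="memory_handler.handler",  # hypothetical handler module
            code=_lambda.Code.from_asset("lambda/mcp_memory"),
            timeout=Duration.seconds(30),
            environment={"MEMORY_BUCKET": memory_bucket.bucket_name},
        )

        # Allow the server to call Bedrock for embeddings and completions.
        mcp_server.add_to_role_policy(
            iam.PolicyStatement(
                actions=["bedrock:InvokeModel"],
                resources=["*"],
            )
        )

        # Read/write access to the memory bucket.
        memory_bucket.grant_read_write(mcp_server)
```

Deploying a stack like this with `cdk deploy` would provision the Lambda-hosted MCP server alongside its storage and permissions in a single step, which is the kind of scalable, repeatable deployment the project aims for.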