Introduction
This project provides a serverless architecture for an Amazon Bedrock-powered LLM agent. It integrates MCP servers, hosted entirely on AWS Lambda, to handle memory and context management: the agent can recall relevant past memories or store new information to produce context-aware responses. The stack uses Bedrock for language understanding and embeddings, S3 Vector tables for persistent semantic memory, and the Python CDK for orchestration, offering a scalable, cost-effective way to deploy intelligent AI agents without managing dedicated infrastructure.
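The semantic-memory flow described above can be sketched conceptually: the agent embeds a query, compares it against stored memory embeddings, and recalls the closest matches. This is a minimal pure-Python illustration of that idea; the actual project uses Bedrock embedding models and S3 Vector indexes rather than in-memory lists, and the function names here are hypothetical:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recall(query_vec, memories, top_k=1):
    # memories: list of (text, embedding) pairs, standing in for
    # records persisted in the vector store.
    scored = sorted(
        memories,
        key=lambda m: cosine_similarity(query_vec, m[1]),
        reverse=True,
    )
    return [text for text, _ in scored[:top_k]]

# Toy 3-dimensional "embeddings" standing in for real Bedrock vectors.
memories = [
    ("user prefers metric units", [0.9, 0.1, 0.0]),
    ("user lives in Seoul",       [0.1, 0.9, 0.2]),
]

print(recall([0.85, 0.15, 0.05], memories))  # → ['user prefers metric units']
```

In the deployed system, the same retrieval step happens inside the Lambda-hosted MCP server, with similarity search delegated to the S3 Vector index instead of computed locally.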