Orchestrates a fully serverless Bedrock LLM agent, dynamically managing memory and context through Lambda-hosted MCP servers and S3 Vector stores.
This project provides a complete serverless solution for building intelligent Bedrock-based LLM agents. MCP servers hosted on AWS Lambda handle memory and context management, letting the agent decide whether to store new information or recall past memories for context-aware responses. The architecture uses AWS Bedrock for language models and embeddings, S3 Vector tables for persistent semantic memory, and Python CDK to orchestrate the entire deployment, keeping it scalable and efficient.
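As an illustration of the recall side of that memory flow, here is a minimal, self-contained sketch (not the project's actual code) of how a memory tool might embed-and-compare: stored texts are recalled only when their vector similarity to the query clears a threshold. A real deployment would use Bedrock embeddings and an S3 Vector table in place of this in-memory list; the names `MemoryStore`, `store`, and `recall` are illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    def __init__(self, threshold=0.8):
        self.items = []          # list of (embedding, text) pairs
        self.threshold = threshold

    def store(self, embedding, text):
        """Persist a new memory with its embedding."""
        self.items.append((embedding, text))

    def recall(self, query_embedding):
        """Return stored texts whose similarity clears the threshold, best first."""
        scored = [(cosine(query_embedding, emb), text) for emb, text in self.items]
        return [text for score, text in sorted(scored, reverse=True)
                if score >= self.threshold]

store = MemoryStore()
store.store([1.0, 0.0], "User prefers concise answers")
store.store([0.0, 1.0], "User's project uses Python CDK")
print(store.recall([0.9, 0.1]))  # → ['User prefers concise answers']
```

The threshold is what lets the agent distinguish "this query relates to something I know" (recall) from "this is new information" (store).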
Key Features
- Lambda-hosted MCP server for memory and context management
- Serverless LLM agent hosted on AWS Lambda
- Infrastructure orchestrated using Python CDK
- Secure API Gateway endpoints with API key authentication
- S3 Vector Store for persistent semantic memory and embeddings
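The CDK orchestration and API-key-secured gateway in the list above could be wired roughly as follows. This is a hypothetical infrastructure sketch, not the project's stack: the names `McpAgentStack`, `memory_handler.handler`, and the `lambda/mcp_server` asset path are illustrative assumptions.

```python
from aws_cdk import Stack, aws_lambda as _lambda, aws_apigateway as apigw
from constructs import Construct

class McpAgentStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda function hosting the MCP memory/context server
        mcp_server = _lambda.Function(
            self, "McpMemoryServer",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="memory_handler.handler",          # hypothetical handler
            code=_lambda.Code.from_asset("lambda/mcp_server"),  # hypothetical path
        )

        # API Gateway front door; every method requires an API key
        api = apigw.LambdaRestApi(
            self, "AgentApi",
            handler=mcp_server,
            default_method_options=apigw.MethodOptions(api_key_required=True),
        )

        # An API key is only enforced once attached to a usage plan and stage
        key = api.add_api_key("AgentApiKey")
        plan = api.add_usage_plan("AgentUsagePlan")
        plan.add_api_key(key)
        plan.add_api_stage(stage=api.deployment_stage)
```

Note the usage-plan step: in API Gateway, setting `api_key_required` alone is not enough; requests are rejected until the key is bound to a usage plan that covers the deployed stage.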