
AI Memory

Provides a production-ready Model Context Protocol (MCP) server for AI agents to store, retrieve, and manage semantic memory and contextual knowledge across sessions.

About

Empower your AI agents with persistent, contextual memory using this Model Context Protocol (MCP) server. Designed for production environments, it enables AI agents to seamlessly store, retrieve, and manage knowledge across sessions. It features a robust stack including TypeScript, PostgreSQL with pgvector for efficient semantic search, and local embedding generation via Transformers.js to eliminate external API costs. With intelligent caching, multi-agent support, memory relationships, and async processing, it provides a performant and scalable solution for building smarter, more context-aware AI applications.
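Below is a rough sketch of how an agent-side MCP client could connect to a memory server like this one and round-trip a memory. It uses the official `@modelcontextprotocol/sdk` client; the tool names (`store_memory`, `search_memory`), their arguments, and the server entry point are illustrative assumptions, not this server's documented interface.

```typescript
// Hypothetical sketch: an MCP client storing and retrieving a memory over stdio.
// Tool names and argument shapes are assumptions, not the server's documented API.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed server entry point
  });

  const client = new Client({ name: "memory-demo", version: "1.0.0" });
  await client.connect(transport);

  // Discover what the server actually exposes before calling anything.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Store a memory, then query it back by semantic similarity (assumed schema).
  await client.callTool({
    name: "store_memory",
    arguments: { content: "User prefers TypeScript examples", agentId: "agent-1" },
  });

  const result = await client.callTool({
    name: "search_memory",
    arguments: { query: "What language does the user prefer?", limit: 5 },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```

Listing the tools first is a safe pattern here, since the exact tool surface is defined by the server rather than by the MCP protocol itself.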

Key Features

  • Advanced features like memory consolidation (clustering) and asynchronous job processing for background tasks
  • Multi-agent support with user context isolation and memory relationships via a graph structure
  • Local embedding generation using Transformers.js (no external API calls)
  • Semantic memory management with PostgreSQL + pgvector for vector similarity search (see the sketch after this list)
  • Intelligent two-tier caching with Redis and an in-memory fallback for low-latency lookups
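
The following is a minimal sketch of the two building blocks named above: local embedding generation with Transformers.js and a pgvector similarity query. The model choice, table name (`memories`), and column names are illustrative assumptions, not this project's actual schema.

```typescript
// Sketch only: local embeddings + pgvector cosine-similarity search.
// The model, "memories" table, and column names are assumptions.
import { pipeline } from "@xenova/transformers";
import pg from "pg";

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });

// Load a local feature-extraction model once; no external embedding API is called.
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

async function searchMemories(query: string, limit = 5) {
  // Produce a normalized sentence embedding for the query text.
  const output = await embed(query, { pooling: "mean", normalize: true });
  const vector = `[${Array.from(output.data).join(",")}]`; // pgvector literal

  // Cosine-distance search via pgvector's <=> operator.
  const { rows } = await pool.query(
    `SELECT id, content, 1 - (embedding <=> $1) AS similarity
       FROM memories
      ORDER BY embedding <=> $1
      LIMIT $2`,
    [vector, limit]
  );
  return rows;
}
```

Keeping embedding generation local and ordering by the `<=>` distance operator lets a single indexed query return the nearest memories without any external API cost.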

Use Cases

  • Enhancing AI chatbots and conversational AI systems with historical context
  • Managing and retrieving semantic knowledge for AI-driven applications
  • Empowering AI agents with persistent, context-aware long-term memory