Memory
Provides persistent knowledge graph-based memory capabilities for Large Language Models.
About
This server implements the Model Context Protocol (MCP) to provide a persistent, knowledge graph-based memory system for Large Language Models (LLMs). It lets LLMs store, retrieve, and reason about information across conversations and sessions, preserving continuity and context for AI applications.
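A minimal sketch of how a client might connect to such a memory server over stdio using the official MCP TypeScript SDK. The launch command and package name (`@modelcontextprotocol/server-memory`, the reference memory server) are assumptions; substitute whatever command starts this particular server.

```typescript
// Sketch: connect an MCP client to a memory server over stdio and list its tools.
// Assumes the reference @modelcontextprotocol/server-memory package as the launch
// target; adjust command/args for this server's actual distribution.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the memory server as a child process and communicate via stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-memory"],
  });

  const client = new Client({ name: "memory-demo", version: "1.0.0" });
  await client.connect(transport);

  // Discover the memory tools the server exposes (create/search/read, etc.).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```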
Key Features
- Cross-session Memory
- Knowledge Graph Storage
- Full CRUD operations for memory management (see the sketch after this list)
- Semantic Search
- Entity Management
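As a rough illustration of the CRUD, entity management, and search features above, the sketch below writes entities and a relation into the knowledge graph and then queries it from an already-connected client (see the connection sketch in the About section). The tool names and argument shapes (`create_entities`, `create_relations`, `search_nodes`) follow the reference MCP memory server and are assumptions; this server may expose differently named tools.

```typescript
// Hypothetical knowledge-graph memory operations via an MCP client.
// Tool names and argument shapes are assumptions based on the reference server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function rememberAndRecall(client: Client) {
  // Create: add entities with typed observations to the knowledge graph.
  await client.callTool({
    name: "create_entities",
    arguments: {
      entities: [
        { name: "Ada Lovelace", entityType: "person", observations: ["Wrote the first published algorithm"] },
        { name: "Analytical Engine", entityType: "machine", observations: ["Designed by Charles Babbage"] },
      ],
    },
  });

  // Create: link entities with a directed, labeled relation.
  await client.callTool({
    name: "create_relations",
    arguments: {
      relations: [{ from: "Ada Lovelace", to: "Analytical Engine", relationType: "wrote programs for" }],
    },
  });

  // Read/search: retrieve matching nodes in a later session to restore context.
  const result = await client.callTool({
    name: "search_nodes",
    arguments: { query: "Ada Lovelace" },
  });
  console.log(result);
}
```

Because the graph is persisted by the server, a later session can call the search or read tools alone and recover everything stored earlier, which is what enables cross-session memory.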
Use Cases
- Integrating advanced memory capabilities into MCP-compliant AI applications.
- Enabling Large Language Models to maintain context across various conversations and sessions.
- Providing LLMs with a persistent knowledge base for reasoning and information retrieval.