Persistent AI Memory
Provides persistent, searchable local memory for AI assistants, LLMs, and Copilot within VS Code.
Overview
Persistent AI Memory is a comprehensive system designed to provide AI assistants, Large Language Models (LLMs), and tools like GitHub Copilot with robust, persistent local memory. It leverages SQLite for storing conversations, AI memories, schedules, VS Code project context, and MCP tool call logs, enabling semantic search through embeddings. This cross-platform system integrates seamlessly with various AI platforms like LM Studio, Ollama, and Koboldcpp, offering zero-configuration setup and flexible integration methods including file monitoring and an HTTP API.
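To make the storage model concrete, here is a minimal sketch of what SQLite-backed persistence for conversations and memories could look like. The table and column names are illustrative assumptions, not the project's actual schema.

```python
import sqlite3

# Hypothetical schema sketch; table and column names are illustrative,
# not necessarily the extension's actual schema.
conn = sqlite3.connect(":memory:")  # the real system would use a file on disk
conn.executescript("""
CREATE TABLE conversations (
    id INTEGER PRIMARY KEY,
    source TEXT,          -- e.g. 'lm-studio', 'ollama', 'copilot'
    content TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE memories (
    id INTEGER PRIMARY KEY,
    text TEXT,
    embedding BLOB        -- serialized vector used for semantic search
);
""")

# Capture a conversation turn, then read it back.
conn.execute("INSERT INTO conversations (source, content) VALUES (?, ?)",
             ("lm-studio", "User asked about vector search."))
row = conn.execute("SELECT source, content FROM conversations").fetchone()
print(row)  # ('lm-studio', 'User asked about vector search.')
```

In a real deployment the connection would point at a database file so memory survives restarts, which is the core of the "persistent" design.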
Key Features
- MCP Integration: Model Context Protocol server with tool call logging and AI self-reflection.
- Vector Search: Semantic search capabilities using embeddings via LM Studio.
- Persistent Memory: SQLite-based storage for conversations, AI memories, schedules, and tool calls.
- Multi-Platform Support: Compatible with LM Studio, VS Code, Koboldcpp, and Ollama.
- Real-time Monitoring: File-based conversation capture with watchdog functionality.
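The vector-search feature above can be sketched as cosine similarity over stored embeddings. In the real system the vectors would come from LM Studio's OpenAI-compatible embeddings endpoint (an assumption); here a toy `embed()` stands in so the example is self-contained and runnable.

```python
import math

def embed(text: str) -> list[float]:
    # Hypothetical stand-in for a real embedding model: hashes characters
    # into a small fixed-size vector and normalizes it to unit length.
    vec = [0.0] * 8
    for i, ch in enumerate(text.lower()):
        vec[i % 8] += ord(ch)
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Dot product suffices because both vectors are unit-normalized.
    return sum(x * y for x, y in zip(a, b))

# Memories stored alongside their embeddings, as the SQLite layer would.
store = {m: embed(m) for m in [
    "User prefers dark mode in VS Code",
    "Project uses SQLite for persistence",
    "Scheduled a reminder for Friday",
]}

def search(query: str, k: int = 1) -> list[str]:
    q = embed(query)
    return sorted(store, key=lambda m: cosine(q, store[m]), reverse=True)[:k]

results = search("Which database does the project use?")
print(results)
```

A production version would replace `embed()` with a call to the embedding model and keep the vectors in the `memories` table rather than in a dict.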
Use Cases
- Log and analyze AI tool calls using the Model Context Protocol (MCP).
- Enhance AI assistant and LLM long-term memory and context retention.
- Monitor and store conversation history from diverse AI platforms and chat logs.
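The first use case, MCP tool-call logging, can be illustrated with a short sketch: each call is appended to a log table, and an "AI self-reflection" pass aggregates usage. The schema and helper names are hypothetical, not the project's actual API.

```python
import datetime
import json
import sqlite3

# Hypothetical tool-call log; the actual MCP server's schema may differ.
db = sqlite3.connect(":memory:")
db.execute("""
CREATE TABLE tool_calls (
    id INTEGER PRIMARY KEY,
    tool TEXT,
    arguments TEXT,       -- JSON-encoded call arguments
    called_at TEXT
)
""")

def log_tool_call(tool: str, arguments: dict) -> None:
    """Record one MCP tool invocation with a UTC timestamp."""
    db.execute(
        "INSERT INTO tool_calls (tool, arguments, called_at) VALUES (?, ?, ?)",
        (tool, json.dumps(arguments),
         datetime.datetime.now(datetime.timezone.utc).isoformat()),
    )

log_tool_call("search_memory", {"query": "vector search"})
log_tool_call("store_memory", {"text": "User prefers SQLite"})

# Self-reflection pass: summarize which tools were used and how often.
counts = dict(db.execute(
    "SELECT tool, COUNT(*) FROM tool_calls GROUP BY tool"
).fetchall())
print(counts)  # {'search_memory': 1, 'store_memory': 1}
```

Keeping arguments as JSON text makes the log queryable later, which is what enables analysis of tool usage patterns over time.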