Introduction
RLM (Recursive Language Models) addresses Claude Code's context window limit, where important information is lost after running `/compact`. It operates as an MCP server that gives Claude Code persistent memory by automatically saving conversation snapshots and key insights. Users can store and recall decisions, facts, and full conversation segments, organize context across multiple projects, and apply retention policies to manage their accumulated knowledge. This eliminates the need to repeat information and keeps Claude Code's context continuous across sessions.
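To illustrate the shape of such a server, below is a minimal sketch of how store/recall tools could be exposed over MCP using the official TypeScript SDK (`@modelcontextprotocol/sdk`). The tool names (`save_memory`, `recall_memory`) and the in-memory store are assumptions for illustration only, not the project's actual API or persistence layer.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// In-memory store standing in for the real persistence layer (assumption).
const memories: { key: string; text: string; savedAt: Date }[] = [];

const server = new McpServer({ name: "memory-sketch", version: "0.1.0" });

// Hypothetical tool: persist a decision, fact, or conversation snippet.
server.tool(
  "save_memory",
  { key: z.string(), text: z.string() },
  async ({ key, text }) => {
    memories.push({ key, text, savedAt: new Date() });
    return { content: [{ type: "text", text: `Saved "${key}".` }] };
  }
);

// Hypothetical tool: recall everything stored under a matching key.
server.tool(
  "recall_memory",
  { key: z.string() },
  async ({ key }) => {
    const hits = memories.filter((m) => m.key.includes(key));
    const text =
      hits.map((m) => `${m.key}: ${m.text}`).join("\n") || "No matches found.";
    return { content: [{ type: "text", text }] };
  }
);

// Claude Code communicates with local MCP servers over stdio.
await server.connect(new StdioServerTransport());
```

Once a server like this is registered with Claude Code as an MCP server, the assistant can call the memory tools itself, which is what makes automatic snapshotting and later recall possible without the user re-pasting context.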