Audits and optimizes Claude Code and Codex context usage to recover up to 25% of token capacity and prevent quality decay.
Token Optimizer is a diagnostic and remediation tool for combating context-window bloat in AI-assisted development environments. It deploys a suite of specialized parallel audit agents to identify "ghost tokens," redundant documentation, and overhead from unused MCP servers or skills. The skill then produces a data-driven optimization plan covering smart compaction, configuration cleanup, and performance verification, keeping Claude sharp and responsive even in complex projects with tight context limits.
Key Features
1. Automated identification of ghost tokens and duplication in CLAUDE.md and MEMORY.md
2. Multi-agent parallel auditing of system configurations, memory files, and MCP servers
3. Smart compaction management to preserve critical information while purging redundant data
4. Quantifiable context recovery tracking with detailed before-and-after token measurements
5. Safety-first implementation featuring automated backups and user-approved diffs
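The duplicate-detection feature above can be illustrated with a toy sketch. This is not the skill's actual implementation; it simply shows the idea of flagging text that appears verbatim in both CLAUDE.md and MEMORY.md (the real audit would presumably also catch near-duplicates and stale references):

```python
import re

def duplicate_blocks(doc_a: str, doc_b: str, min_words: int = 4) -> set:
    """Return sentences that appear verbatim in both documents.

    Illustrative only: a stand-in for the skill's duplication audit.
    Short fragments (under min_words) are ignored to cut false positives.
    """
    def sentences(text: str) -> set:
        # Split on sentence-ending punctuation or blank lines.
        parts = re.split(r"(?<=[.!?])\s+|\n+", text)
        return {p.strip() for p in parts if len(p.split()) >= min_words}

    return sentences(doc_a) & sentences(doc_b)

# Hypothetical file contents for demonstration.
claude_md = "Always run tests before committing. Use black for formatting."
memory_md = "Project uses pytest. Always run tests before committing."
print(duplicate_blocks(claude_md, memory_md))
```

Every sentence reported here is a candidate "ghost token" block: context paid for twice on every turn.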
Use Cases
1. Auditing complex MCP and Skill setups to remove overhead from unused tools and commands
2. Maintaining long-term project memory by automating smart context compaction and documentation pruning
3. Optimizing large codebases where Claude's context window frequently feels full or sluggish
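The before-and-after recovery tracking mentioned in the feature list can be sketched in a few lines. This is a hypothetical illustration, not the skill's code: it uses a crude characters-per-token heuristic, whereas the real tool presumably reads exact token counts from the model's tokenizer:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English prose.
    # A real audit would use the model's actual tokenizer counts.
    return max(1, len(text) // 4)

def recovery_report(before: str, after: str) -> dict:
    """Summarize tokens recovered by an optimization pass (illustrative)."""
    b, a = estimate_tokens(before), estimate_tokens(after)
    return {
        "tokens_before": b,
        "tokens_after": a,
        "recovered": b - a,
        "recovered_pct": round(100 * (b - a) / b, 1),
    }

bloated = "x" * 4000  # stand-in for a bloated CLAUDE.md
pruned = "x" * 3000   # same file after pruning redundant sections
print(recovery_report(bloated, pruned))
```

In this toy example the pass recovers 25% of the estimated budget, matching the "up to 25%" figure the skill advertises; real savings depend on how much redundancy the audit finds.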