Monitors and reports token savings and context window optimization metrics for AI coding sessions.
The Context Mode Stats skill provides deep visibility into the efficiency of your AI coding sessions by tracking exactly how much context window space sandboxing saves. Sandboxing reduces tool output volume; this skill lets users view real-time token consumption, savings ratios, and a detailed per-tool performance breakdown. It serves as a crucial monitoring utility for developers who want to maximize their token budget and maintain high model performance over long-running sessions without hitting context limits.
Key Features
7,311 GitHub stars
- Displays real-time token consumption statistics
- Offers a read-only view to ensure monitoring doesn't alter session state
- Provides a granular per-tool breakdown of data usage
- Visualizes efficiency gains from tool output sandboxing
- Calculates the context savings ratio and percentage reduction
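The savings ratio and percentage reduction listed above come down to simple arithmetic over two token counts. A minimal sketch, assuming hypothetical inputs (the skill's actual field names and data sources are not documented here):

```python
def savings_stats(raw_tokens: int, sandboxed_tokens: int) -> tuple[float, float]:
    """Return (savings_ratio, percent_reduction) for a session.

    raw_tokens: tokens the tool output would have consumed without sandboxing.
    sandboxed_tokens: tokens actually placed in the context window.
    (Both names are illustrative, not the skill's real schema.)
    """
    saved = raw_tokens - sandboxed_tokens
    ratio = raw_tokens / sandboxed_tokens if sandboxed_tokens else float("inf")
    percent = 100.0 * saved / raw_tokens if raw_tokens else 0.0
    return ratio, percent

# Example: 40,000 raw tokens reduced to 5,000 in the context window
ratio, pct = savings_stats(40_000, 5_000)
# ratio = 8.0 (an 8x reduction), pct = 87.5 (percent of context space saved)
```

Reporting both numbers is useful: the ratio conveys the compression factor, while the percentage maps directly onto how much of the context budget was reclaimed.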
Use Cases
- Verifying the efficiency of context-mode sandboxing across different tools
- Auditing AI token usage and costs during complex development tasks
- Identifying high-volume tools that consume the most context window space
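The last use case, finding the tools that consume the most context, amounts to grouping token counts by tool and sorting. A sketch under assumed per-call records (the record shape and tool names here are hypothetical):

```python
from collections import defaultdict

# Hypothetical per-call records; the skill's real data model is not shown here.
calls = [
    {"tool": "read_file", "raw": 12_000, "sandboxed": 1_500},
    {"tool": "read_file", "raw": 8_000,  "sandboxed": 900},
    {"tool": "run_tests", "raw": 20_000, "sandboxed": 2_600},
]

# Aggregate raw and sandboxed token counts per tool.
totals: dict[str, dict[str, int]] = defaultdict(lambda: {"raw": 0, "sandboxed": 0})
for call in calls:
    totals[call["tool"]]["raw"] += call["raw"]
    totals[call["tool"]]["sandboxed"] += call["sandboxed"]

# Rank tools by raw context consumed to surface the heaviest consumers first.
for tool, t in sorted(totals.items(), key=lambda kv: -kv[1]["raw"]):
    saved = t["raw"] - t["sandboxed"]
    print(f"{tool}: raw={t['raw']} sandboxed={t['sandboxed']} saved={saved}")
```

Sorting by raw (pre-sandbox) volume rather than sandboxed volume highlights where sandboxing is doing the most work, which is exactly what the per-tool breakdown is for.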