Overview
The Prompt Caching skill transforms Claude into a specialized caching architect focused on reducing LLM operational costs and latency. It provides expert guidance on implementing Anthropic's native prompt caching, managing response caches, and applying Cache-Augmented Generation (CAG) patterns. The skill is aimed at developers building production-grade AI applications where token consumption and response time are critical: it helps structure prompts for maximum prefix reuse and store responses efficiently without sacrificing accuracy.
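
As a minimal sketch of the prefix-reuse idea, the snippet below uses the Anthropic Python SDK to mark a large, stable system block with `cache_control` so later calls can read it from the cache instead of reprocessing it. The model ID and the document variable are illustrative placeholders; field names, minimum cacheable lengths, and pricing should be checked against Anthropic's current documentation.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical placeholder for a long, stable reference document. Putting stable
# content first maximizes the shared prefix that can be cached across requests.
LARGE_REFERENCE_DOC = "..."

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # placeholder; use your target model
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": "You answer questions strictly from the reference document.",
        },
        {
            "type": "text",
            "text": LARGE_REFERENCE_DOC,
            # Marks the end of the cacheable prefix so subsequent calls with the
            # same prefix can reuse it instead of paying full input-token cost.
            "cache_control": {"type": "ephemeral"},
        },
    ],
    messages=[
        {"role": "user", "content": "Summarize section 2 of the document."}
    ],
)

# The usage block reports how much of the prompt was written to or read from
# the cache, which is useful for verifying that prefix reuse is actually happening.
print(response.usage.cache_creation_input_tokens, response.usage.cache_read_input_tokens)
```

On a first call the cacheable prefix is written (cache creation tokens); on subsequent calls with an identical prefix it is read back at a reduced rate, which is where the cost and latency savings come from.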