About
LiteLLM provides a standardized Python wrapper that allows developers to call APIs from multiple LLM providers—including Anthropic, OpenAI, Azure, and local servers like llamafile—using a single, consistent OpenAI-compatible format. It eliminates the complexity of provider-specific SDKs by offering unified exception mapping, automatic retry and fallback logic, and built-in cost tracking. This skill is essential for building resilient, provider-agnostic AI applications and seamless local-to-cloud development workflows within Claude Code.
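As a minimal sketch of the unified interface, the example below calls two different providers through the same `completion()` function, switching only the model string; the specific model identifiers are illustrative, and API keys are assumed to be set in the environment.

```python
# Minimal sketch: the same completion() call targets different providers
# by changing the model string (model names here are illustrative).
from litellm import completion

messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# OpenAI (reads OPENAI_API_KEY from the environment)
openai_response = completion(model="gpt-4o-mini", messages=messages)

# Anthropic (reads ANTHROPIC_API_KEY from the environment)
anthropic_response = completion(
    model="anthropic/claude-3-5-sonnet-20240620", messages=messages
)

# Both responses follow the OpenAI-compatible response shape
print(openai_response.choices[0].message.content)
print(anthropic_response.choices[0].message.content)
```

Because every provider returns the same OpenAI-style response object, downstream code (parsing, logging, retries) does not need provider-specific branches.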