About
The AI Core skill provides a framework for orchestrating the Ahling Command Center's AI layer. It lets developers build a robust, fully local AI pipeline: LiteLLM routes requests across models, vLLM and Ollama serve high-performance inference, Qdrant handles vector search, and Langfuse provides end-to-end observability. Standardized Docker Compose configurations and implementation patterns tame the complexity of running multiple AI services, yielding a scalable, observable environment for building LLM-powered applications and automated workflows.
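As a rough illustration of the kind of Docker Compose stack described above, the sketch below wires together LiteLLM, Ollama, and Qdrant. The image tags, ports, volume names, and the `litellm-config.yaml` routing file are illustrative assumptions, not the skill's actual configuration; vLLM (which typically needs GPU access) and Langfuse (which requires a Postgres backend) would be added as additional services in the same style.

```yaml
# Hypothetical minimal compose stack for a local AI pipeline.
# All tags, ports, and paths are assumptions for illustration.
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    ports:
      - "4000:4000"                              # OpenAI-compatible proxy
    volumes:
      - ./litellm-config.yaml:/app/config.yaml   # hypothetical model-routing config
    command: ["--config", "/app/config.yaml"]
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"                            # Ollama's default API port
    volumes:
      - ollama-models:/root/.ollama              # persist pulled models

  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"                              # Qdrant's default HTTP port
    volumes:
      - qdrant-data:/qdrant/storage              # persist vector collections

volumes:
  ollama-models:
  qdrant-data:
```

With a stack like this, application code talks only to the LiteLLM endpoint (here `http://localhost:4000`), so backends can be swapped between Ollama and vLLM in the routing config without touching application code.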