Implement LLM observability with Dynatrace on OpenShift AI

Source: Redhat.com

Article summary

Red Hat and Dynatrace have collaborated to deliver deep observability for large language models (LLMs) deployed on OpenShift AI, aiming to enhance visibility into generative AI applications.

  • The solution leverages Dynatrace's OneAgent and OpenTelemetry to capture and analyze LLM-specific metrics (see the sketch after this list).
  • It provides insights into LLM operational details such as token usage, prompt and completion latency, model costs, and sentiment analysis.
  • The integration enables performance optimization, cost control, and responsible AI practices for LLM workloads.
  • This observability extends across the entire LLM lifecycle, from development to production on OpenShift AI.
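
For illustration, here is a minimal sketch of how an application could emit LLM-specific telemetry with the OpenTelemetry Python SDK and ship it to a Dynatrace environment over OTLP. The environment URL, API token, service name, model name, and token counts are placeholders, and the `gen_ai.*` attribute names follow the OpenTelemetry GenAI semantic conventions rather than anything prescribed by this specific integration.

```python
# Minimal sketch: export one LLM span to Dynatrace's OTLP trace ingest endpoint.
# Assumes the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Route spans to the Dynatrace environment's OTLP endpoint (placeholder tenant/token).
exporter = OTLPSpanExporter(
    endpoint="https://<your-environment-id>.live.dynatrace.com/api/v2/otlp/v1/traces",
    headers={"Authorization": "Api-Token <your-token>"},
)

provider = TracerProvider(
    resource=Resource.create({"service.name": "llm-chat-service"})  # placeholder name
)
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("llm-observability-demo")

# Wrap a single model call in a span and attach LLM-specific attributes
# (model name, token usage) so they can be analyzed in Dynatrace.
with tracer.start_as_current_span("chat.completion") as span:
    span.set_attribute("gen_ai.request.model", "granite-7b-instruct")  # placeholder model
    # ... call the model served on OpenShift AI here ...
    span.set_attribute("gen_ai.usage.input_tokens", 142)   # placeholder counts
    span.set_attribute("gen_ai.usage.output_tokens", 318)
```

From attributes like these, latency per completion, token consumption, and derived cost can be charted and alerted on in the Dynatrace backend alongside the infrastructure metrics that OneAgent already collects.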