Article Summary
New Relic has announced support for the Model Context Protocol (MCP), aiming to provide true end-to-end observability for AI applications.
- The integration allows developers to trace user requests through large language models (LLMs) and their various integrations, offering unprecedented visibility into the AI application stack.
- New Relic's platform will now ingest and process MCP data, enabling comprehensive monitoring of AI workflows from user input to LLM response and tool usage (see the sketch after this list).
- This support is crucial for debugging, performance optimization, and understanding the behavior of complex AI systems, especially those that use tools and retrieval-augmented generation (RAG).
- It helps address the 'black box' challenge of LLMs by providing transparent insights into their operations and interactions within an application.
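To make the workflow concrete, here is a minimal, hypothetical sketch of what tracing an MCP tool call under a New Relic transaction might look like in Python. It assumes the New Relic Python agent is installed and configured via a local `newrelic.ini`, and that the agent's MCP instrumentation (as described in the announcement) records the client's tool call as a span inside the transaction. The `docs_server.py` server, the `search_docs` tool, and the config path are illustrative placeholders, not part of the announcement.

```python
# Hypothetical sketch: report an MCP tool call to New Relic as part of a traced task.
import asyncio

import newrelic.agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Initialize the agent from a local config file (placeholder path).
newrelic.agent.initialize("newrelic.ini")

# Parameters for a hypothetical local MCP server exposing a "search_docs" tool.
server_params = StdioServerParameters(command="python", args=["docs_server.py"])


@newrelic.agent.background_task(name="mcp-tool-call-demo")
async def run() -> None:
    # Everything inside this task is reported as one transaction; the MCP tool
    # call below would appear as a child span if the agent instruments the
    # mcp client library as the announcement describes (an assumption here).
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "search_docs", arguments={"query": "error budgets"}
            )
            print(result)


if __name__ == "__main__":
    asyncio.run(run())
```

With instrumentation like this, the trace for a single user request could connect the LLM call, the MCP tool invocation, and the downstream work the tool performs, which is the end-to-end view the announcement emphasizes.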