Tempo
Enables AI assistants to query and analyze distributed tracing data from Grafana Tempo through the Model Context Protocol.
About
Empower your AI assistants with real-time access to Grafana Tempo's distributed tracing data. This Go-based server implements the Model Context Protocol (MCP), providing AI clients with a `tempo_query` tool to explore traces, identify issues, and gain insights directly from conversational interfaces. It integrates seamlessly with popular AI tools like Claude Desktop, Cursor, and n8n, streamlining AI-driven observability workflows.
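As a rough illustration of what a `tempo_query`-style tool does under the hood, the sketch below sends a TraceQL query to Tempo's HTTP search endpoint (`/api/search`) and returns the raw JSON result. The `TEMPO_URL` and `TEMPO_TOKEN` environment variable names, the function name, and the example query are illustrative assumptions, not necessarily how this server is configured.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"os"
)

// queryTempo sends a TraceQL query to Tempo's HTTP search API and returns
// the raw JSON response body. TEMPO_URL and TEMPO_TOKEN are placeholder
// variable names; the server's actual configuration may differ.
func queryTempo(traceQL string) (string, error) {
	base := os.Getenv("TEMPO_URL") // e.g. http://localhost:3200
	req, err := http.NewRequest("GET", base+"/api/search?q="+url.QueryEscape(traceQL), nil)
	if err != nil {
		return "", err
	}
	if token := os.Getenv("TEMPO_TOKEN"); token != "" {
		req.Header.Set("Authorization", "Bearer "+token)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	return string(body), err
}

func main() {
	// Example TraceQL: find recent error spans from a given service.
	out, err := queryTempo(`{ resource.service.name = "checkout" && status = error }`)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(out)
}
```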
Key Features
- Offers flexible configuration for Tempo URL and authentication via environment variables
- Includes Docker support for easy deployment and testing
- Go-based implementation of the Model Context Protocol (MCP)
- Supports both standard I/O (stdin/stdout) and HTTP Server-Sent Events (SSE) transports (see the stdio sketch after this list)
- Provides a `tempo_query` tool for AI agents to interact with Grafana Tempo
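A minimal sketch of driving the stdio transport directly is shown below: launch the server with its Tempo URL in the environment, then send a JSON-RPC `tools/call` request naming the `tempo_query` tool. The binary name, environment variable, and `query` argument name are assumptions, and a real MCP session would begin with the standard `initialize` handshake before any tool call.

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"os/exec"
)

func main() {
	// Launch the MCP server over stdio; the binary name and environment
	// variable are placeholders for whatever the project actually uses.
	cmd := exec.Command("./tempo-mcp-server")
	cmd.Env = append(os.Environ(), "TEMPO_URL=http://localhost:3200")
	stdin, _ := cmd.StdinPipe()
	stdout, _ := cmd.StdoutPipe()
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	// A real MCP session starts with an "initialize" exchange; only the
	// tool call itself is shown here. The "query" argument name is assumed.
	enc := json.NewEncoder(stdin)
	enc.Encode(map[string]any{
		"jsonrpc": "2.0", "id": 1, "method": "tools/call",
		"params": map[string]any{
			"name":      "tempo_query",
			"arguments": map[string]any{"query": `{ status = error }`},
		},
	})

	// Read the server's JSON-RPC response line and print it.
	line, _ := bufio.NewReader(stdout).ReadString('\n')
	fmt.Println(line)
}
```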
Use Cases
- Integrating Tempo query capabilities into AI-powered development environments like Cursor
- Automating observability workflows by connecting AI agents to Tempo data via tools like n8n
- Enabling AI assistants (e.g., Claude) to execute trace queries and analyze distributed system performance