Provides LLMs with access to Observe API functionality, OPAL query assistance, and troubleshooting runbooks through a Model Context Protocol server.
This experimental Model Context Protocol (MCP) server acts as a secure bridge between capable Large Language Models (LLMs), such as Claude Sonnet, and the Observe platform. Unlike a traditional chatbot, it calls the Observe APIs directly to execute OPAL queries, export worksheet data, manage monitors, and retrieve dataset information, keeping LLMs out of these internal functions and preventing private data leakage. The server also uses Pinecone as a vector database for semantic search across OPAL documentation and troubleshooting runbooks, so assistants can ground their troubleshooting and query construction in that vectorized knowledge.
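
The bridge pattern can be sketched in a few lines. The following is a minimal, illustrative Python example using the official MCP Python SDK (FastMCP), `requests`, and the `pinecone` client: one tool forwards an OPAL pipeline to the Observe API, another performs semantic search against a Pinecone index. The endpoint path, payload shape, index name, metadata field, environment variable names, and embedding model are assumptions for illustration, not this project's actual identifiers.

```python
# Minimal sketch of the MCP bridge described above.
# Anything marked "assumed" is illustrative, not the project's real configuration.
import os

import requests
from mcp.server.fastmcp import FastMCP
from pinecone import Pinecone

# Assumed environment variable names for Observe and Pinecone credentials.
OBSERVE_URL = f"https://{os.environ['OBSERVE_CUSTOMER_ID']}.observeinc.com"
OBSERVE_TOKEN = os.environ["OBSERVE_API_TOKEN"]

pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
docs_index = pc.Index("opal-docs")  # assumed index name

mcp = FastMCP("observe-mcp")


def embed(text: str) -> list[float]:
    """Embed a query with Pinecone hosted inference.
    Assumes the docs were indexed with this same model; use whichever
    model actually built the index."""
    result = pc.inference.embed(
        model="multilingual-e5-large",
        inputs=[text],
        parameters={"input_type": "query"},
    )
    return result[0].values


@mcp.tool()
def execute_opal_query(dataset_id: str, pipeline: str) -> str:
    """Run an OPAL pipeline against a dataset via the Observe API.
    The endpoint path and payload shape are placeholders; consult the
    Observe API documentation for the real contract."""
    resp = requests.post(
        f"{OBSERVE_URL}/v1/meta/export/query",  # assumed endpoint
        headers={"Authorization": f"Bearer {OBSERVE_TOKEN}"},
        json={"datasetId": dataset_id, "pipeline": pipeline},  # illustrative payload
        timeout=60,
    )
    resp.raise_for_status()
    return resp.text


@mcp.tool()
def search_opal_docs(question: str, top_k: int = 5) -> list[str]:
    """Semantic search over vectorized OPAL docs and runbooks in Pinecone."""
    res = docs_index.query(vector=embed(question), top_k=top_k, include_metadata=True)
    # Assumes document chunks were stored under a "text" metadata field.
    return [m.metadata["text"] for m in res.matches]


if __name__ == "__main__":
    mcp.run()  # serve the tools over stdio to an MCP-capable client
```

An MCP-capable client (for example, Claude Desktop) would launch a script like this over stdio and invoke the tools on demand; only the tool results, never the Observe credentials or internal API calls, are surfaced to the model.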