About
This Model Context Protocol (MCP) server uses the OpenAI Responses API as its core inference engine and is designed for seamless integration with MCP clients such as Claude Code and OpenAI Codex. Its distinguishing feature is that `web_search` is always permitted, so the underlying model decides autonomously when and how to search. Each response is returned as structured output: the main response body, a `used_search` flag indicating whether a search was performed, and citations identifying the information sources (URLs or source IDs). With a flexible configuration system and robust debugging options, it serves as a powerful backend for AI agents that require advanced reasoning and up-to-date information retrieval.
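The structured output described above can be pictured as follows. This is a minimal sketch of the shape a client might receive; the exact field names and citation schema are assumptions for illustration, not the server's definitive contract.

```python
# Illustrative shape of the server's structured output.
# Field names ("response", "url", "title") are assumptions; only
# `used_search` and the presence of citations come from the description.
result = {
    "response": "The answer, synthesized by the model.",  # main response body
    "used_search": True,  # True when the model chose to invoke web_search
    "citations": [
        # Each citation points at an information source (URL or source ID).
        {"url": "https://example.com/source", "title": "Example source"},
    ],
}

# A client can branch on whether fresh information was retrieved:
if result["used_search"]:
    sources = [c["url"] for c in result["citations"]]
```

When `used_search` is `False`, the citations list would typically be empty, since the answer came entirely from the model's own knowledge.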