A Model Context Protocol server that exposes GPT-5 inference via the OpenAI Responses API, with customizable query parameters and optional web search integration.
This server brings GPT-5 inference directly to your Model Context Protocol (MCP) clients. It provides a `gpt5_query` tool built on the OpenAI Responses API, with control over verbosity, reasoning effort, model selection, and tool choice on each query. An optional web search preview feature grounds responses in real-time information. The server integrates with MCP-compatible environments such as Claude Code and Claude Desktop, and supports configuration via environment variables with per-call overrides, making it a flexible way to bring advanced AI capabilities into your workflow.
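
The tool is essentially a thin wrapper around the Responses API. The sketch below illustrates what such a server could look like using the official MCP Python SDK (FastMCP) and the OpenAI Python SDK; the parameter names, defaults, and the `GPT5_MODEL` environment variable are illustrative assumptions, not necessarily what this repository implements.

```python
# Sketch only: a minimal MCP server exposing a gpt5_query-style tool.
# Assumes the MCP Python SDK (FastMCP) and the OpenAI Python SDK are installed;
# names and defaults are illustrative, not this repo's actual code.
import os

from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("gpt5-server")
client = OpenAI()  # reads OPENAI_API_KEY from the environment


@mcp.tool()
def gpt5_query(
    query: str,
    model: str = os.getenv("GPT5_MODEL", "gpt-5"),  # hypothetical env default with per-call override
    reasoning_effort: str = "medium",               # "minimal" | "low" | "medium" | "high"
    verbosity: str = "medium",                      # "low" | "medium" | "high"
    enable_web_search: bool = False,
) -> str:
    """Query GPT-5 through the OpenAI Responses API and return the text output."""
    tools = [{"type": "web_search_preview"}] if enable_web_search else []
    response = client.responses.create(
        model=model,
        input=query,
        reasoning={"effort": reasoning_effort},
        text={"verbosity": verbosity},
        tools=tools,
    )
    return response.output_text


if __name__ == "__main__":
    mcp.run()  # serve over stdio for MCP clients such as Claude Desktop
```

An MCP client would then register the server's launch command in its configuration (for Claude Desktop, typically `claude_desktop_config.json`) and supply `OPENAI_API_KEY` through that entry's environment settings.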