Bridges Shogi AI engines to AI agents via an HTTP API and the Model Context Protocol (MCP).
This project wraps native Shogi AI engine binaries and exposes them in two ways: as an HTTP API bridge for position analysis and as an MCP server. Its goal is to connect Large Language Models (LLMs) with USI-compatible Shogi engines so that LLMs can explain Shogi moves, and to serve as a sample program for this kind of AI integration.
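For context, the exchange that such a bridge presumably performs under the hood follows the USI (Universal Shogi Interface) protocol. The sketch below shows a minimal, hypothetical USI session driven from Python; the engine path and the example moves are placeholder assumptions, and this is not this project's actual bridge code.

```python
# Minimal sketch of a USI handshake and search request (assumptions:
# ENGINE_PATH and the moves shown are placeholders, not taken from this project).
import subprocess

ENGINE_PATH = "./your-usi-engine"  # hypothetical path to a USI engine binary


def send(proc, cmd):
    """Write one USI command to the engine's stdin."""
    proc.stdin.write(cmd + "\n")
    proc.stdin.flush()


def read_until(proc, token):
    """Collect engine output lines until a line starts with the given token."""
    lines = []
    while True:
        line = proc.stdout.readline().strip()
        lines.append(line)
        if line.startswith(token):
            return lines


engine = subprocess.Popen(
    [ENGINE_PATH], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
)

# Standard USI handshake.
send(engine, "usi")
read_until(engine, "usiok")
send(engine, "isready")
read_until(engine, "readyok")

# Ask for the best move from the start position after 7g7f (pawn to 7f),
# with a 1000 ms byoyomi time limit.
send(engine, "position startpos moves 7g7f")
send(engine, "go btime 0 wtime 0 byoyomi 1000")
output = read_until(engine, "bestmove")
print(output[-1])  # e.g. "bestmove 3c3d"

send(engine, "quit")
engine.wait()
```

The HTTP API and MCP layers would wrap an exchange like this so that an LLM agent can request an analysis and receive the engine's evaluation and best move as structured output.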