About
This project wraps native Shogi AI engine binaries and exposes them in two ways: as an HTTP API bridge for position analysis, and as a Model Context Protocol (MCP) server. Its goal is to connect Large Language Models (LLMs) with USI-compatible Shogi engines so that an LLM can generate explanations for Shogi moves; it also serves as a sample program for this kind of AI integration.
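To register an MCP server with Claude Desktop, an entry is added to its `claude_desktop_config.json`. A minimal sketch is shown below; the server name `shogi-mcp`, the command, and the path are placeholders, not this project's actual launch command.

```json
{
  "mcpServers": {
    "shogi-mcp": {
      "command": "node",
      "args": ["/path/to/shogi-mcp/server.js"]
    }
  }
}
```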
Key Features
- Model Context Protocol (MCP) server implementation
- Integration setup for Claude Desktop
- USI protocol engine wrapper for native Shogi engines
- HTTP API Bridge for Shogi position analysis
- Configurable analysis parameters (depth, multipv, threads)
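The analysis parameters above map directly onto USI protocol commands sent to the engine. The sketch below shows one plausible way a wrapper could build that command sequence; the `setoption` and `go depth` commands are standard USI, while the option names `Threads` and `MultiPV` are conventional but engine-dependent, and the function itself is illustrative, not this project's API.

```python
def usi_analysis_commands(sfen: str, depth: int, multipv: int, threads: int) -> list[str]:
    """Build a USI command sequence for analysing one position.

    Option names (Threads, MultiPV) follow common engine conventions
    and may differ between engines.
    """
    return [
        f"setoption name Threads value {threads}",
        f"setoption name MultiPV value {multipv}",
        f"position sfen {sfen}",
        f"go depth {depth}",
    ]

# Initial Shogi position in SFEN notation
start = "lnsgkgsnl/1r5b1/ppppppppp/9/9/9/PPPPPPPPP/1B5R1/LNSGKGSNL b - 1"
for cmd in usi_analysis_commands(start, depth=20, multipv=3, threads=4):
    print(cmd)
```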
Use Cases
- Integrate Large Language Models (LLMs) with Shogi AI for move explanations
- Provide a standardized API interface for Shogi engine capabilities
- Analyze Shogi positions remotely via HTTP requests
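Remote analysis over HTTP amounts to POSTing a position plus analysis parameters to the bridge. The sketch below builds such a request with Python's standard library; the endpoint path `/analyze`, the port, and the JSON field names are assumptions for illustration, not the bridge's documented API.

```python
import json
import urllib.request

# Hypothetical request body: SFEN position plus analysis parameters.
payload = {
    "sfen": "lnsgkgsnl/1r5b1/ppppppppp/9/9/9/PPPPPPPPP/1B5R1/LNSGKGSNL b - 1",
    "depth": 20,
    "multipv": 3,
}

req = urllib.request.Request(
    "http://localhost:8080/analyze",  # hypothetical endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending the request would look like this (requires a running bridge):
# with urllib.request.urlopen(req) as resp:
#     result = json.load(resp)

print(req.method, req.full_url)
```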