About
This server implements the Model Context Protocol (MCP) so that AI assistants, including local LLMs such as Ollama, can efficiently browse and search the Rancher Helm catalog. It offers deep metadata access, historical version listing, and, most importantly, server-side filtering of `values.yaml` files. Filtering on the server significantly reduces the amount of data sent to the LLM, making interactions more cost-effective and context-efficient for DevOps AI assistants.
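To make the payload reduction concrete, here is a minimal TypeScript sketch of the idea behind server-side `values.yaml` filtering. The `filterValues` helper, the dotted key-path convention, and the sample values are illustrative assumptions, not this server's actual API.

```typescript
// Hypothetical illustration of server-side values.yaml filtering.
// The function name and key-path format are assumptions for this sketch.
import yaml from "js-yaml";

// Return only the requested dotted key paths from a values.yaml document,
// so the LLM receives a small subset instead of the whole file.
function filterValues(valuesYaml: string, keyPaths: string[]): Record<string, unknown> {
  const values = yaml.load(valuesYaml) as Record<string, unknown>;
  const result: Record<string, unknown> = {};

  for (const path of keyPaths) {
    // Walk the nested object one segment at a time (e.g. "ingress.enabled").
    let node: unknown = values;
    for (const segment of path.split(".")) {
      if (node !== null && typeof node === "object" && segment in (node as object)) {
        node = (node as Record<string, unknown>)[segment];
      } else {
        node = undefined;
        break;
      }
    }
    if (node !== undefined) {
      result[path] = node;
    }
  }
  return result;
}

// Example: an assistant asks only about ingress and replica settings.
const raw = `
replicaCount: 3
ingress:
  enabled: true
  className: nginx
resources:
  limits:
    cpu: 500m
`;
console.log(filterValues(raw, ["ingress.enabled", "replicaCount"]));
// -> { "ingress.enabled": true, "replicaCount": 3 }
```

Because only the requested keys are returned, a chart with a large `values.yaml` can be explored a few settings at a time, which is what keeps the context sent to the model small.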