Seamless integration with local Ollama models.
Uses a simple JSON configuration file for setup.
Offers both stdio and SSE transport options.
Enables LLMs to execute tools from connected servers.
Supports multiple MCP-compatible servers.
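To illustrate the features above, here is a minimal sketch of what such a JSON configuration might look like. It assumes the `mcpServers` layout commonly used by MCP clients, with one stdio server (launched as a subprocess) and one SSE server (reached over HTTP); the exact key names, server names, and URL are assumptions and may differ for this project.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "remote-tools": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```

With a file like this in place, the client would connect to each listed server at startup and expose their tools to the local Ollama model.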