- WebSocket-based transport for efficient client–server communication
- Local LLM server integration via the `ollama` package
- Supports multiple LLM models and configurations
- Generic MCP client for resource listing and data streaming
- Server and client configurable via command-line arguments
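As a rough illustration of the last point, the command-line configuration and the JSON framing used over a WebSocket transport might look like the following sketch. All flag names (`--host`, `--port`, `--model`) and helper functions here are assumptions for illustration, not the project's actual interface.

```python
import argparse
import json

# Hypothetical CLI and message-framing helpers; flag names and defaults
# are assumptions, not the project's actual options.

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="LLM server / MCP client")
    parser.add_argument("--host", default="127.0.0.1", help="server bind address")
    parser.add_argument("--port", type=int, default=8765, help="WebSocket port")
    parser.add_argument("--model", default="llama3", help="Ollama model name")
    return parser

def encode_message(role: str, content: str) -> str:
    # Frame a chat message as a JSON string for the WebSocket transport.
    return json.dumps({"role": role, "content": content})

def decode_message(raw: str) -> dict:
    # Parse an incoming JSON frame back into a message dict.
    return json.loads(raw)

if __name__ == "__main__":
    args = build_parser().parse_args(["--port", "9000"])
    print(args.port)
    msg = encode_message("user", "hello")
    print(decode_message(msg)["content"])
```

The same parser can serve both the server and the client, with each side reading only the flags it needs.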