100% local operation with Ollama Vision LLMs for zero cloud API cost
Windows-native desktop control via the Win32 API, not a VM or container
MCP-native integration for instant compatibility with a wide range of AI clients
Comprehensive built-in safety system including action policies and emergency stop
Vision LLM-powered screen understanding for robust UI interaction
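As a rough illustration of how an action-policy gate with an emergency stop, as described above, might be structured (the names `ActionPolicy` and `is_allowed` are hypothetical and not taken from this project):

```python
from dataclasses import dataclass, field

@dataclass
class ActionPolicy:
    # Hypothetical sketch: a deny list of action names plus an
    # emergency-stop flag that overrides everything else.
    denied_actions: set = field(default_factory=lambda: {"shutdown", "delete_file"})
    emergency_stop: bool = False

    def is_allowed(self, action: str) -> bool:
        # Once the emergency stop is engaged, block all actions;
        # otherwise, allow anything not on the deny list.
        if self.emergency_stop:
            return False
        return action not in self.denied_actions

policy = ActionPolicy()
print(policy.is_allowed("click"))     # allowed by default
print(policy.is_allowed("shutdown"))  # blocked by the deny list
policy.emergency_stop = True
print(policy.is_allowed("click"))     # blocked once the stop is engaged
```

Keeping the gate as a single pure function like this makes it easy to test in isolation and to call before every desktop action the agent attempts.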