About
This experimental desktop application provides a modular hosting environment for Model Context Protocol (MCP) servers and demonstrates a local-first AI architecture. It pairs a Tauri desktop shell (a Rust core with a React UI) with a Python sidecar for AI orchestration, decoupling the user-facing interface from tool execution. Designed as a technical proof of concept, it lets users experiment with integrating local large language models served through Ollama with various open-source tools, recreating a 'Claude Desktop'-style experience entirely within a private, local setting.
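The split between the desktop shell and the sidecar is easiest to see from the sidecar's side. The sketch below is illustrative only and not the project's actual code: the endpoint, default model name, and `ask_local_model` helper are assumptions, and the real sidecar additionally brokers MCP tool calls. It simply shows the kind of request the orchestration layer sends to a local Ollama instance.

```python
# Minimal sketch of the sidecar's orchestration role: forward a prompt to a
# local Ollama instance over its HTTP API. Endpoint and model name are
# assumptions; MCP tool brokering is omitted.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local port


def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a single-turn chat request to the local model and return its reply."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete JSON response instead of a stream
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["message"]["content"]


if __name__ == "__main__":
    print(ask_local_model("Summarize what an MCP server does in one sentence."))
```

Because everything above talks to `localhost`, no prompt or tool output ever leaves the machine, which is the point of the local-first design.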