Integrates a self-hosted Ollama AI model with VistA patient data as context for querying patient information.
VistAAI provides a self-hosted AI solution that combines Ollama with VistA's patient data. An MCP server mediates communication between the AI model and a VistA database, allowing users to ask questions about VistA patients and receive relevant answers. The setup consists of Docker containers for Ollama, a model loader, a VistA instance with FMQL, and the MCP server itself, providing a complete environment for exploring AI-driven healthcare data analysis. A web UI is also included for interacting with the Ollama model directly, without the VistA context.
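
As a minimal sketch of how a query against the Ollama container might look, the snippet below sends a single non-streaming request to Ollama's `/api/generate` endpoint with a VistA record inlined as context. The port, model name (`llama3`), and inlined record are assumptions for illustration; in this project the MCP server would normally supply the VistA context through tool calls rather than a hand-built prompt.

```python
# Sketch only: assumes the Ollama container listens on the default port 11434
# and that a model named "llama3" has already been pulled by the model loader.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default endpoint


def ask_about_patient(question: str, patient_record: str, model: str = "llama3") -> str:
    """Send one non-streaming prompt to Ollama, using VistA data as context."""
    prompt = (
        "You are a clinical assistant. Answer using only the VistA record below.\n\n"
        f"Patient record:\n{patient_record}\n\nQuestion: {question}"
    )
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Hypothetical record for demonstration; real data comes from VistA via FMQL.
    record = "Name: TEST,PATIENT  DOB: 1950-01-01  Allergies: PENICILLIN"
    print(ask_about_patient("Does this patient have any drug allergies?", record))
```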