VistAAI

Integrates a self-hosted Ollama AI with VistA model context for querying patient information.

About

VistAAI provides a self-hosted AI stack that combines Ollama with VistA's patient data. An MCP server mediates between the AI model and a VistA database, so users can ask natural-language questions about VistA patients and receive relevant answers. The setup runs as Docker containers: Ollama, a model loader, a VistA instance with fmQL, and the MCP server itself, giving a complete environment for exploring AI-driven healthcare data analysis. A web UI is also included for interacting with the Ollama model directly, without the VistA context.
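Because the Ollama container exposes Ollama's standard HTTP API, the bundled model can also be queried directly, outside the VistA/MCP path. A minimal sketch, assuming Ollama's default port 11434 is mapped to the host and the model is tagged `llama3.2`; the endpoint and payload shape follow Ollama's documented `/api/generate` API:

```python
import json
import urllib.request

# Assumption: the Ollama container publishes its default port on localhost.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the request body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of newline-delimited chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    """Send a prompt to the local Ollama container and return its reply."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires the Ollama container to be running):
#   ask_ollama("In one sentence, what is VistA?")
```

This is the same API the bundled web UI talks to; queries sent this way have no VistA context, which is exactly the "direct interaction" mode described above.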

Key Features

  • Integrates Ollama with VistA data using an MCP server
  • Includes a self-hosted Ollama container with a Llama 3.2 model
  • Provides a VistA container with fmQL for data querying
  • Offers a web UI for direct interaction with the Ollama model
  • Demonstrates AI's ability to recognize and refuse to share confidential patient information

Use Cases

  • Querying VistA patient data using natural language
  • Exploring the capabilities of AI in understanding and interacting with healthcare databases
  • Developing applications that require secure and context-aware access to patient information
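The natural-language query path runs through the MCP server, and MCP messages use JSON-RPC 2.0 framing: the model's MCP client invokes a server-side tool with a `tools/call` request. A minimal sketch of constructing such a request; the tool name `lookup_patient` and its arguments are hypothetical, since the actual tool names are defined by this project's MCP server:

```python
import itertools
import json

# JSON-RPC request ids must be unique per connection.
_ids = itertools.count(1)


def build_tool_call(tool_name: str, arguments: dict) -> dict:
    """Build an MCP `tools/call` JSON-RPC request for the given tool."""
    return {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# Hypothetical tool and argument names, for illustration only:
request = build_tool_call("lookup_patient", {"name": "CARTER,DAVID"})
print(json.dumps(request, indent=2))
```

The MCP server would translate such a call into an fmQL query against the VistA instance and return the result to the model as the tool's output, which is how the model answers patient questions with real VistA context.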