Enables on-premises conversational retrieval-augmented generation (RAG) with configurable containers and optional ChatGPT or Claude integration.
Minima provides a flexible, open-source solution for on-premises RAG, offering three operational modes: fully isolated operation with local LLMs, integration with ChatGPT for querying local documents via custom GPTs, and integration with Anthropic Claude. It uses Docker containers for straightforward deployment and configuration, supporting a range of embedding models, LLMs served via Ollama, and rerankers. With Minima, users retain data security and control while leveraging the power of conversational AI.
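To make the container-based configuration concrete, the sketch below shows what a fully isolated (local LLM) deployment might look like. The variable names, model identifiers, and compose file name are illustrative assumptions, not the project's confirmed settings; consult the Minima README for the exact environment variables and launch command.

```shell
# Hypothetical .env for the fully isolated mode (all names are illustrative).
LOCAL_FILES_PATH=/home/user/docs                          # directory of documents to index
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2  # embedding model for indexing
EMBEDDING_SIZE=768                                         # dimensionality of the embeddings
OLLAMA_MODEL=qwen2:0.5b                                    # any chat model served by Ollama
RERANKER_MODEL=BAAI/bge-reranker-base                      # reranker applied to retrieved chunks

# Launch the containers with the chosen configuration
# (compose file name is also an assumption for illustration):
# docker compose -f docker-compose-ollama.yml --env-file .env up --build
```

Under this setup, indexing, embedding, retrieval, and generation all run inside local containers, which is what allows the fully isolated mode to operate without sending documents to an external service.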