Create an interactive AI assistant with Streamlit, NVIDIA NIM/Ollama, and the Model Context Protocol (MCP).
Llama Streamlit provides a conversational interface for interacting with Large Language Models (LLMs), with real-time external tool execution via the Model Context Protocol (MCP). The project supports model selection across NVIDIA NIM and Ollama backends, API configuration, and tool integration, all within a user-friendly Streamlit chat interface, aiming for a responsive assistant experience where model replies and tool results appear in the conversation as they arrive.
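Below is a minimal sketch of the core chat flow described above: a Streamlit chat page that talks to either NVIDIA NIM or a local Ollama server through their OpenAI-compatible endpoints. The endpoint URLs, default model names, and session keys here are illustrative assumptions for the sketch, not the project's actual configuration, and the MCP tool-calling layer is omitted for brevity.

```python
# Sketch: Streamlit chat UI with a switchable NVIDIA NIM / Ollama backend.
# Assumes both backends are reached via their OpenAI-compatible APIs.
import streamlit as st
from openai import OpenAI

st.title("Llama Streamlit")

# Sidebar: backend selection and API configuration.
backend = st.sidebar.selectbox("Backend", ["NVIDIA NIM", "Ollama"])
if backend == "NVIDIA NIM":
    base_url = "https://integrate.api.nvidia.com/v1"   # NIM's hosted OpenAI-compatible endpoint
    api_key = st.sidebar.text_input("NVIDIA API key", type="password")
    model = st.sidebar.text_input("Model", "meta/llama-3.1-8b-instruct")
else:
    base_url = "http://localhost:11434/v1"              # Ollama's local OpenAI-compatible endpoint
    api_key = "ollama"                                   # Ollama ignores the key but the client requires one
    model = st.sidebar.text_input("Model", "llama3.1")

client = OpenAI(base_url=base_url, api_key=api_key or "none")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# Handle a new user turn: send history to the model and stream the reply.
if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    with st.chat_message("assistant"):
        stream = client.chat.completions.create(
            model=model,
            messages=st.session_state.messages,
            stream=True,
        )
        reply = st.write_stream(stream)  # renders tokens as they arrive, returns full text
    st.session_state.messages.append({"role": "assistant", "content": reply})
```

In the full project, the assistant's tool calls would additionally be routed through an MCP client before the final reply is rendered; the streaming chat loop above stays the same.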