Llama Streamlit
Created by Nikunj2003
Create an interactive AI assistant with Streamlit, NVIDIA NIM/Ollama, and the Model Context Protocol (MCP).
About
Llama Streamlit provides a conversational interface for interacting with Large Language Models (LLMs), with real-time external tool execution via the Model Context Protocol (MCP). The project supports model selection across backends (NVIDIA NIM or Ollama), API configuration, and tool integration, all within a user-friendly Streamlit chat interface designed for a responsive, seamless AI assistant experience.
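To illustrate the tool-execution flow described above, here is a minimal sketch of a dispatcher: the model emits a tool call as JSON, the app runs the matching registered tool, and the result is serialized back into the conversation. The registry, the `get_time` tool, and the `execute_tool_call` helper are illustrative assumptions, not this project's actual API.

```python
import json

# Hypothetical tool registry: in an MCP-style setup, the server advertises
# tools the model may invoke; the chat loop dispatches each call by name.
TOOLS = {
    "get_time": lambda args: "12:00",  # placeholder tool for illustration
}

def execute_tool_call(call_json: str) -> str:
    """Parse a tool call like {"name": ..., "arguments": {...}} and run it."""
    call = json.loads(call_json)
    tool = TOOLS.get(call["name"])
    if tool is None:
        # Return a structured error so the model can recover gracefully.
        return json.dumps({"error": f"unknown tool: {call['name']}"})
    return json.dumps({"result": tool(call.get("arguments", {}))})
```

The result string would then be appended to the chat history as a tool message, letting the model incorporate it into its next reply.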
Key Features
- Support for multiple LLM backends (NVIDIA NIM & Ollama)
- Docker support for easy deployment
- Streamlit UI with interactive chat elements
- LLM-powered chat interface
- Real-time tool execution via MCP
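Both backends expose OpenAI-compatible chat endpoints, so supporting the two mostly comes down to swapping the base URL and auth header. A minimal sketch (the `backend_config` helper and the exact endpoint URLs are assumptions for illustration, not taken from this repo):

```python
# Hypothetical config helper: NVIDIA NIM is a hosted API requiring a key,
# while Ollama serves an OpenAI-compatible endpoint locally by default.
def backend_config(backend: str, api_key: str = "") -> dict:
    if backend == "nim":
        return {
            "base_url": "https://integrate.api.nvidia.com/v1",
            "headers": {"Authorization": f"Bearer {api_key}"},
        }
    if backend == "ollama":
        # Ollama's default local port; no auth header needed.
        return {"base_url": "http://localhost:11434/v1", "headers": {}}
    raise ValueError(f"unknown backend: {backend}")
```

An OpenAI-style client can then be pointed at whichever config the user selects in the Streamlit sidebar.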
Use Cases
- Deploying LLM-powered applications with Docker
- Integrating external tools with LLMs for enhanced functionality
- Building interactive AI assistants with a user-friendly chat interface