Ollama Gradio
Creates a local, privacy-focused LLM agent using Ollama and Gradio, with the Model Context Protocol (MCP) for secure tool calling.
About
This project demonstrates how to build a privacy-aware, locally hosted LLM agent. It uses Ollama to run LLMs on your own hardware, the Model Context Protocol (MCP) for safe tool calling, and Gradio for a conversational web UI. The system is backed by a local SQLite database and is exposed both as an agent and as an MCP server, so all data and processing stay on your machine.
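The MCP-server side described above might be wired up roughly as follows. This is a minimal sketch, not the project's actual code: the `run_query` helper, the `sqlite-tools` server name, and the `agent.db` path are illustrative assumptions, and the registration step assumes the `fastmcp` Python package.

```python
import sqlite3

DB_PATH = "agent.db"  # illustrative path for the local database

def run_query(sql: str, db_path: str = DB_PATH) -> list[tuple]:
    """Run a SQL statement against the local SQLite database.

    This is the kind of helper such a project exposes to the LLM as a tool;
    everything executes locally, so no data leaves the machine.
    """
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()

def build_server():
    """Wrap the helper in a FastMCP server (assumes `fastmcp` is installed)."""
    from fastmcp import FastMCP  # imported lazily so the sketch loads without it
    mcp = FastMCP("sqlite-tools")
    mcp.tool(run_query)  # register the helper as an MCP tool
    return mcp

if __name__ == "__main__":
    build_server().run()  # serves the tools over stdio by default
```

With this wiring, any MCP-capable client can discover and call `run_query` without the database ever leaving the host.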
Key Features
- Uses Ollama to run LLMs locally, ensuring data privacy.
- Employs the Model Context Protocol (MCP) for secure tool calling.
- Includes a Gradio-based chat interface for easy interaction.
- Utilizes a local SQLite database for data storage and management.
- Exposes database tools via FastMCP for LLM access.
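The privacy guarantee in the first feature comes from talking only to a local Ollama server. A sketch of how a Gradio chat callback could do that over Ollama's REST API: the `/api/chat` endpoint and port 11434 are Ollama's documented defaults, while the model name and the `(user, assistant)` history format are assumptions for illustration.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint
MODEL = "llama3.2"  # assumed model name; any locally pulled model works

def build_payload(history: list[tuple[str, str]], user_message: str) -> dict:
    """Convert (user, assistant) chat history into an Ollama chat request body."""
    messages = []
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": user_message})
    return {"model": MODEL, "messages": messages, "stream": False}

def chat(history: list[tuple[str, str]], user_message: str) -> str:
    """Send the conversation to the local Ollama server and return the reply.

    Requires a running `ollama serve`; the request never leaves localhost.
    """
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(history, user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

A function with this signature can be handed straight to a Gradio chat UI as the response callback.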
Use Cases
- Querying and updating local databases using natural language.
- Building privacy-focused LLM applications that do not send data to external servers.
- Demonstrating local LLM agent autonomy and privacy.
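The first use case, querying a database in natural language, works by having the model emit tool calls that the agent executes locally. A minimal sketch of that dispatch step, where the registry and the toy `add` tool are hypothetical and the tool-call shape follows Ollama's chat API (whose `tool_calls` entries carry `arguments` as a dict):

```python
TOOLS: dict = {}

def tool(fn):
    """Register a function so the agent can invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Toy stand-in for a real database tool such as a SQL query helper."""
    return a + b

def dispatch(tool_call: dict):
    """Execute one entry of message['tool_calls'] from a chat response."""
    call = tool_call["function"]
    return TOOLS[call["name"]](**call["arguments"])
```

The result of `dispatch` would then be appended to the conversation as a tool message so the model can phrase its final answer.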