A local, privacy-focused LLM agent built with Ollama and Gradio, using the Model Context Protocol (MCP) for secure tool calling.
This project demonstrates how to build a privacy-aware, locally hosted LLM agent. It uses Ollama to run models on your own hardware, the Model Context Protocol (MCP) for safe tool calling, and Gradio for a conversational web UI. The system is backed by a local SQLite database and is exposed both as an agent and as an MCP server, so all data and processing stay on your machine.
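As a rough sketch of the local chat loop described above, the snippet below wires a Gradio `ChatInterface` handler to a local Ollama server via its REST API (`/api/chat` on the default port 11434). The model name `llama3` and the exact handler shape are assumptions for illustration, not this project's actual code; MCP tool calling and the SQLite layer are omitted for brevity.

```python
import json
import urllib.request

# Assumptions: Ollama's default local endpoint and an already-pulled model.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3"  # hypothetical; substitute whatever `ollama pull` fetched

def to_ollama_messages(history, user_msg):
    """Convert Gradio-style (user, assistant) pairs into Ollama chat messages."""
    messages = []
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": user_msg})
    return messages

def respond(message, history):
    """Forward the conversation to the local Ollama server; nothing leaves the machine."""
    payload = json.dumps({
        "model": MODEL,
        "messages": to_ollama_messages(history, message),
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    return reply["message"]["content"]

if __name__ == "__main__":
    import gradio as gr
    # Launches a local-only web UI; respond(message, history) is called per turn.
    gr.ChatInterface(respond).launch()
```

Because the handler only talks to `localhost`, the chat stays private by construction; swapping in a hosted API would break that guarantee.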