
Ollama MCP Client

Created by jonigl

Enables local LLMs to interact with Model Context Protocol (MCP) servers using Ollama, facilitating tool use.

About

This Python-based client allows you to connect to Model Context Protocol (MCP) servers and leverage Ollama for processing queries with tool use capabilities. It supports multiple servers, dynamic model switching, and tool management, all accessible through a rich terminal interface. Adapted from the Model Context Protocol quickstart guide, it offers a user-friendly way to interact with LLMs that support function calling.
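
As a rough sketch of the general pattern such a client follows (not necessarily this project's exact implementation), the example below uses the official `mcp` and `ollama` Python packages to connect to a single stdio MCP server, advertise its tools to a function-calling Ollama model, and execute any tool calls the model requests. The server command, script name, and model name are placeholders.

```python
import asyncio

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Placeholder: launch any stdio MCP server you have available.
    server = StdioServerParameters(command="python", args=["my_mcp_server.py"])

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the server's tools and translate them into the
            # tool schema Ollama's chat API expects.
            mcp_tools = (await session.list_tools()).tools
            ollama_tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in mcp_tools
            ]

            # Ask a model that supports function calling; it may answer
            # directly or request one or more tool calls.
            response = ollama.chat(
                model="llama3.1",  # placeholder model name
                messages=[{"role": "user", "content": "What's the weather in Paris?"}],
                tools=ollama_tools,
            )

            # Execute any requested tool calls against the MCP server.
            for call in response["message"].get("tool_calls") or []:
                result = await session.call_tool(
                    call["function"]["name"],
                    arguments=dict(call["function"]["arguments"]),
                )
                print(result.content)


asyncio.run(main())
```

In a full client, the tool results would be appended to the conversation and sent back to the model for a final answer; this sketch stops after executing the calls.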

Key Features

  • Multi-Server Support
  • Configuration Persistence
  • Dynamic Model Switching
  • Tool Management
  • Rich Terminal Interface

Use Cases

  • Switching between different Ollama models on-the-fly (see the sketch after this list).
  • Enabling local LLMs to use external tools.
  • Interacting with multiple MCP servers simultaneously.
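
As a minimal illustration of on-the-fly model switching (the model names here are only examples of locally pulled models, not ones this project requires), the same prompt can be sent to different Ollama models without restarting anything:

```python
import ollama


def ask(model: str, prompt: str) -> str:
    # One chat turn against the given model; Ollama loads the model on demand.
    response = ollama.chat(model=model, messages=[{"role": "user", "content": prompt}])
    return response["message"]["content"]


# Example model names; use whichever models `ollama pull` has fetched locally.
print(ask("llama3.1", "Summarize the Model Context Protocol in one sentence."))
print(ask("qwen2.5", "Summarize the Model Context Protocol in one sentence."))
```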