Ollama

Integrates Ollama's local large language models into Model Context Protocol-powered applications.

Overview

Ollama bridges the gap between local large language models and the Model Context Protocol (MCP), enabling seamless integration of Ollama's capabilities into your applications. It provides full coverage of the Ollama API, including an OpenAI-compatible chat completion endpoint, so you can run AI models locally with complete control and privacy. Manage models, execute them with customizable prompts, and control the Ollama server directly through the MCP interface.
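
To give a sense of the OpenAI-compatible chat completion endpoint, here is a minimal sketch that calls a local Ollama instance directly (the same API this server wraps). It assumes a default install listening on http://localhost:11434 and that a model such as llama3.2 has already been pulled; both are placeholders to adjust for your setup.

```typescript
// Minimal sketch: call Ollama's OpenAI-compatible chat completion endpoint directly.
// Assumes a default local Ollama install at http://localhost:11434 and a locally
// pulled model tag (here "llama3.2"); adjust both for your environment.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chat(messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",   // any model tag available locally
      messages,
      temperature: 0.7,    // the same kind of execution parameter the server exposes
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat([{ role: "user", content: "Summarize what MCP is in one sentence." }])
  .then(console.log)
  .catch(console.error);
```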

Key Features

  • Complete Ollama API integration
  • Customizable model execution parameters (temperature, timeout)
  • Model management (pull, push, list, create, copy, remove); see the sketch after this list
  • OpenAI-compatible chat completion API
  • Server control (start, manage, view model info)
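
The model-management features map onto Ollama's native REST endpoints. The sketch below issues pull, list, and remove calls directly against a local instance; the base URL and field names follow current Ollama API documentation and are assumptions that may differ slightly on older versions.

```typescript
// Minimal sketch of the model-management operations (pull / list / remove)
// issued directly against Ollama's native REST API.
// Assumes Ollama is listening on the default http://localhost:11434.
const OLLAMA = "http://localhost:11434";

async function pullModel(model: string): Promise<void> {
  // Non-streaming pull; Ollama responds once the download finishes.
  const res = await fetch(`${OLLAMA}/api/pull`, {
    method: "POST",
    body: JSON.stringify({ model, stream: false }),
  });
  if (!res.ok) throw new Error(`pull failed: ${res.status}`);
}

async function listModels(): Promise<string[]> {
  // GET /api/tags returns the models available locally.
  const res = await fetch(`${OLLAMA}/api/tags`);
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

async function removeModel(model: string): Promise<void> {
  const res = await fetch(`${OLLAMA}/api/delete`, {
    method: "DELETE",
    body: JSON.stringify({ model }),
  });
  if (!res.ok) throw new Error(`delete failed: ${res.status}`);
}
```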

Use Cases

  • Using an OpenAI-compatible chat completion API with local models
  • Running local LLMs within MCP-powered applications
  • Creating custom models from Modelfiles within the MCP environment (see the sketch below)
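
As an illustration of the Modelfile use case, the following sketch creates a derived model through Ollama's /api/create endpoint. The model name, base model, and system prompt are illustrative placeholders, and the structured fields shown here reflect recent Ollama releases; older versions accepted a raw Modelfile string instead.

```typescript
// Minimal sketch: create a custom model derived from a base model via /api/create.
// Assumes a recent Ollama at http://localhost:11434; names below are examples only.
async function createModel(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/create", {
    method: "POST",
    body: JSON.stringify({
      model: "mcp-helper",   // name of the new local model (example)
      from: "llama3.2",      // base model to derive from (must already be pulled)
      system: "You are a concise assistant for MCP-powered apps.",
      stream: false,
    }),
  });
  if (!res.ok) throw new Error(`create failed: ${res.status}`);
  console.log(await res.json()); // final status object on success
}

createModel().catch(console.error);
```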