
Ollama

Enables Claude to run local Ollama models asynchronously and retrieve the outputs later.

About

The Ollama MCP server integrates local Ollama language models with Claude, enabling asynchronous execution of prompts and scripts. It provides tools for model management, script handling, and job control, so Claude can run bash commands and multi-step workflows and retrieve the saved outputs later. With simple configuration, the server streamlines the use of local Ollama models within Claude Desktop.

Key Features

  • Provides tools for model, script, and job management.
  • Executes bash commands and multi-step workflows.
  • Runs Ollama models asynchronously.
  • Saves and manages script templates with variable substitution.
  • Offers simple configuration for Claude Desktop integration.
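
Claude Desktop discovers MCP servers through its `claude_desktop_config.json` file. The entry below is a hypothetical sketch of what registering this server might look like; the actual command name, arguments, and install method depend on how the server is packaged and are not specified here.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "ollama-mcp-server",
      "args": []
    }
  }
}
```

After editing the config, Claude Desktop must be restarted for the new server to be picked up.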

Use Cases

  • Managing and executing complex workflows involving bash commands and script templates.
  • Integrating local LLMs with fast-agent for multi-agent workflows and tool calling.
  • Running Ollama models within Claude Desktop for asynchronous prompt execution.
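
The script-template feature mentioned above substitutes variables into saved command templates before execution. The server's exact template syntax is not documented here; the sketch below illustrates the general idea using Python's standard `string.Template` with shell-style `$variable` placeholders, and the template string and variable names are illustrative assumptions.

```python
from string import Template

# Hypothetical saved script template with $variable placeholders;
# the server's actual substitution syntax may differ.
template = Template("ollama run $model '$prompt'")

# Substitute concrete values to produce a runnable command string.
command = template.substitute(model="llama3", prompt="Summarize this file")
print(command)  # → ollama run llama3 'Summarize this file'
```

In a job-control setup like this server's, the resulting command would be executed asynchronously and its output saved for later retrieval.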