About
Scampi is an MCP server designed to integrate with Claude Code, enabling it to leverage external Large Language Models (LLMs) for high-volume, token-intensive tasks. This allows Claude Code to offload bulk processing to cheaper local LLM servers (like LM Studio, Ollama, llama.cpp, text-generation-webui) or cloud providers (Z.ai), reserving Claude's context for higher-level judgment and reasoning. Scampi provides a suite of tools for tasks such as indexing codebases for semantic search, generating multiple solution approaches in parallel, and brainstorming diverse ideas, all while managing concurrency and caching for optimal performance.
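As a rough illustration of the offloading idea, the sketch below shows how an MCP tool can forward a token-heavy prompt to a local OpenAI-compatible endpoint (such as the ones LM Studio and Ollama expose) and return only the answer to Claude Code. It uses the MCP Python SDK's `FastMCP` helper; the tool name, model, and port are illustrative assumptions, not Scampi's actual implementation.

```python
# Minimal sketch, not Scampi's source: one MCP tool that offloads a bulk
# prompt to a local OpenAI-compatible LLM server and returns its reply.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("scampi-sketch")  # server name is an illustrative assumption

@mcp.tool()
async def offload_completion(prompt: str, model: str = "local-model") -> str:
    """Send a token-heavy prompt to a local LLM and return only its answer."""
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(
            # Assumed LM Studio default endpoint; Ollama and llama.cpp offer similar APIs.
            "http://localhost:1234/v1/chat/completions",
            json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # serve over stdio so Claude Code can call the tool
```

A server along these lines would typically be registered with Claude Code (for example via `claude mcp add` or an `.mcp.json` entry), after which Claude can delegate bulk work to the tool while keeping its own context for judgment and synthesis.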