- Provides a `local_chat` MCP tool for chat completions (see the sketch below)
- Supports `local.chat` as a direct request handler for custom MCP methods
- Configurable base URL for local LM Studio/Ollama-compatible APIs
- Allows setting a default model for calls where none is specified
- Enables self-reflection and same-model sub-calls with controlled budgets
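
Under the hood, a tool like `local_chat` boils down to forwarding messages to the local server's OpenAI-compatible `/v1/chat/completions` endpoint, falling back to a default model when the caller doesn't name one. The sketch below shows that call shape in TypeScript under stated assumptions: the `LOCAL_BASE_URL` and `DEFAULT_MODEL` environment variable names, the `localChat` helper, and the LM Studio-style port are illustrative, not the repo's actual identifiers.

```typescript
// Minimal sketch of a local chat-completion call (assumed env var names).
const BASE_URL = process.env.LOCAL_BASE_URL ?? "http://localhost:1234/v1";
const DEFAULT_MODEL = process.env.DEFAULT_MODEL ?? "llama-3.1-8b-instruct";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper mirroring what a `local_chat` tool call might do:
// use the caller's model if given, otherwise fall back to the default.
async function localChat(messages: ChatMessage[], model?: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: model ?? DEFAULT_MODEL,
      messages,
      max_tokens: 512, // example of capping a sub-call's budget
    }),
  });
  if (!res.ok) {
    throw new Error(`Local API error: ${res.status} ${res.statusText}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage: a same-model sub-call, e.g. asking the model to reflect on a question.
localChat([{ role: "user", content: "Summarize MCP in one sentence." }])
  .then(console.log)
  .catch(console.error);
```

The default-model fallback keeps tool calls simple: callers only pass `model` when they want to override the configured default.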