Enables AI assistants to remotely manage and control LM Studio models.
This tool functions as an MCP (Model Context Protocol) server, providing AI assistants with a standardized interface to control and interact with LM Studio models. It facilitates remote model management by allowing clients to list available models, load specific models into memory with configurable parameters, unload models when no longer needed, and retrieve detailed information about both downloaded and loaded LLMs.
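Under the hood, operations like model listing map onto LM Studio's local OpenAI-compatible REST server. As a rough sketch of what such a call looks like (assuming LM Studio's default port 1234 and the standard `/v1/models` endpoint; this is not this project's actual code):

```python
import json
import urllib.request

# LM Studio's local server defaults to port 1234; adjust if configured differently.
LMSTUDIO_BASE = "http://localhost:1234/v1"

def parse_model_ids(payload: dict) -> list[str]:
    # OpenAI-compatible /v1/models responses list entries under "data".
    return [m["id"] for m in payload.get("data", [])]

def list_models(base_url: str = LMSTUDIO_BASE) -> list[str]:
    """Return the IDs of models LM Studio reports as available."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        return parse_model_ids(json.load(resp))
```

An MCP tool handler for "list models" would wrap a call like this and return the result to the AI client.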
Key Features
- List all downloaded LLM models
- Load models into memory with configurable parameters
- Unload specific model instances from memory
- Verify connectivity to LM Studio
- List currently loaded models in memory
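The load/unload/list tools above imply some server-side state tracking which model instances are in memory and with what parameters. A minimal illustrative sketch of that state machine (names and parameters here are hypothetical, not this project's actual API):

```python
class ModelManager:
    """Toy model of the state an MCP server's load/unload tools manage."""

    def __init__(self, downloaded: list[str]):
        self.downloaded = set(downloaded)   # models available on disk
        self.loaded: dict[str, dict] = {}   # instance id -> load parameters

    def load(self, model_id: str, **params) -> str:
        # params stands in for configurable options, e.g. context length.
        if model_id not in self.downloaded:
            raise ValueError(f"model not downloaded: {model_id}")
        instance = f"{model_id}#{len(self.loaded)}"
        self.loaded[instance] = {"model": model_id, **params}
        return instance

    def unload(self, instance: str) -> None:
        self.loaded.pop(instance)

    def list_loaded(self) -> list[str]:
        return list(self.loaded)
```

Each MCP tool (load, unload, list loaded) then becomes a thin wrapper around one of these methods, with the real server delegating to LM Studio instead of an in-memory dict.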
Use Cases
- Programmatically load and unload LLM models for various tasks
- Provide a remote API for AI clients to interact with local LM Studio instances
- Integrate LM Studio model management into AI assistant applications
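To use a server like this from an MCP-capable client, you register it in the client's configuration. The snippet below shows the general shape of such an entry (the command and path are hypothetical placeholders; consult this project's own installation instructions for the actual values):

```json
{
  "mcpServers": {
    "lmstudio": {
      "command": "node",
      "args": ["/path/to/lmstudio-mcp/build/index.js"]
    }
  }
}
```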