# Web-LLM
Provides an MCP server to programmatically interact with Web-LLM models running in a browser environment.
## About
This tool is an MCP (Model Context Protocol) server that facilitates interaction with @mlc-ai/web-llm. It launches a headless Chromium instance using Playwright, loads a dedicated HTML page that hosts the Web-LLM interface, and exposes that page's functionality as MCP tools. This setup allows automated operations such as text generation, chat, model management, and status checks to run directly in a browser context, making local LLM inference accessible via an API.
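Under the Model Context Protocol, a client invokes server functionality by sending JSON-RPC `tools/call` requests. A sketch of what such a request might look like against this server; the tool name `generate_text` and its argument names are illustrative assumptions, not the server's confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_text",
    "arguments": {
      "prompt": "Explain WebGPU in one sentence.",
      "max_tokens": 128
    }
  }
}
```

The server would forward the prompt to the Web-LLM engine running in the browser page and return the generated text in the JSON-RPC result.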
## Key Features
- Browser-based LLM inference via @mlc-ai/web-llm
- Tools for text generation, chat, status checks, and screenshots
- Automated browser interactions using Playwright
- Dynamic management and switching of various Web-LLM models
- Self-contained HTML page hosting the Web-LLM runtime
## Use Cases
- Automating large language model (LLM) text generation tasks programmatically.
- Integrating interactive chat functionalities powered by local LLMs into applications.
- Managing and switching between different Web-LLM models via an API for testing or dynamic usage.
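To use the server for cases like these, an MCP client is typically pointed at it through a JSON configuration entry specifying the launch command. A minimal sketch, assuming a Node entry point; the `web-llm` key and the script path are illustrative placeholders, not confirmed values from this repository:

```json
{
  "mcpServers": {
    "web-llm": {
      "command": "node",
      "args": ["path/to/server.js"]
    }
  }
}
```

With this entry in place, the client starts the server process on demand and can then call its text-generation, chat, model-management, and status tools.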