# Safe Local Python Executor
Executes Python code generated by LLMs in a secure, locally-hosted environment, leveraging Hugging Face's LocalPythonExecutor and MCP for LLM application integration.
## About
This tool executes Python code generated by Large Language Models (LLMs) locally, addressing the need for a safe, easily set-up Python runtime. It wraps Hugging Face's `LocalPythonExecutor` in an MCP (Model Context Protocol) server, so LLM applications such as Claude Desktop or Cursor can access a restricted Python environment without the complexity of Docker or virtual machines. The result balances ease of use and security: file I/O is limited and importable modules are restricted, mitigating the risks of executing untrusted code.
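The restriction idea can be illustrated with a minimal sketch (not the actual `LocalPythonExecutor` implementation, and with a hypothetical whitelist): parse the code first, reject any import outside the allowed set, and only then execute it.

```python
import ast

# Hypothetical whitelist for illustration; the real executor's list differs.
SAFE_MODULES = {"math", "random", "statistics", "datetime"}

def run_restricted(code: str) -> dict:
    """Execute `code` only if every import it makes is in SAFE_MODULES."""
    tree = ast.parse(code)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names = [alias.name.split(".")[0] for alias in node.names]
        elif isinstance(node, ast.ImportFrom):
            names = [(node.module or "").split(".")[0]]
        else:
            continue
        for name in names:
            if name not in SAFE_MODULES:
                raise ImportError(f"import of {name!r} is not allowed")
    namespace: dict = {}
    exec(compile(tree, "<llm-code>", "exec"), namespace)
    return namespace

ns = run_restricted("import math\nresult = math.sqrt(16)")
print(ns["result"])  # 4.0
```

A static AST check like this is only one layer; the real executor also interprets the code itself rather than handing it to plain `exec`, which is what allows it to restrict file I/O as well.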
## Key Features
- Limits available Python imports to a predefined safe list.
- Runs in a `uv`-managed Python virtual environment.
- Offers safer Python code execution than direct `eval()` calls.
- Restricts file I/O operations for enhanced security.
- Exposes a `run_python` tool via MCP.
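The core of a `run_python` tool can be sketched as follows (hypothetical signature; the actual tool's interface may differ): execute the code and return its captured stdout, which is roughly what an MCP code-execution tool reports back to the client.

```python
import contextlib
import io

def run_python(code: str) -> str:
    """Hypothetical core of a `run_python` MCP tool: run the code and
    return whatever it printed. The real server additionally applies the
    import and file-I/O restrictions described above via LocalPythonExecutor."""
    buffer = io.StringIO()
    namespace: dict = {}
    with contextlib.redirect_stdout(buffer):
        exec(code, namespace)
    return buffer.getvalue()

print(run_python("print(2 + 3)"), end="")  # prints 5
```

In the real server this function would be registered as an MCP tool so that clients like Claude Desktop can invoke it over the protocol.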
## Use Cases
- Providing a secure Python runtime for AI-powered coding assistants.
- Adding a code interpreter to LLM applications like Claude Desktop.
- Executing LLM-generated Python code in a controlled environment.