Provides an all-in-one suite of tools for local and remote Large Language Models, enabling multi-engine web search, secure web page fetching and summarization, internal document reading, academic lookups, and calculations.
Tool4lm is a Model Context Protocol (MCP) server built with Node.js and TypeScript, designed to extend local LLM runtimes such as LM Studio or Ollama. Its toolset spans multi-engine web search, secure web page fetching and summarization, search and reading of local documents (including PDFs), academic research lookup, and an integrated calculator. Because it requires no API keys by default, it works as a self-contained bridge between your LLM and external information sources and computational tasks.
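Like most MCP servers, Tool4lm would typically be registered in an MCP client's configuration file. A minimal sketch of such an entry is below; the `npx` invocation and the package name `tool4lm` are assumptions for illustration, not confirmed by this description — consult the project's own install instructions for the actual command.

```json
{
  "mcpServers": {
    "tool4lm": {
      "command": "npx",
      "args": ["-y", "tool4lm"]
    }
  }
}
```

Since the server needs no API keys by default, basic use should not require any `env` entries in this block.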