
Tool4lm

Provides an all-in-one suite of tools for local and remote Large Language Models, enabling multi-engine web search, secure web page fetching and summarization, internal document reading, academic lookups, and calculations.

About

Tool4lm is a comprehensive Model Context Protocol (MCP) server built with Node.js and TypeScript, designed to augment local LLMs running in clients such as LM Studio or Ollama. It offers a powerful set of tools: multi-engine web search, secure web page fetching and summarization, internal document search and reading (including PDF and other formats), academic research lookup, and an integrated calculator. By default it requires no API keys, making it a self-contained way to connect your LLM to diverse information sources and computational tasks.
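
For context, here is a minimal sketch of how an MCP client could launch and talk to Tool4lm over stdio using the official MCP TypeScript SDK. The server entry-point path, the tool name (`web.search`), and its argument shape are illustrative assumptions; the listing does not specify them.

```typescript
// Minimal sketch: connecting an MCP client to Tool4lm over stdio.
// The entry-point path and the "web.search" tool name are illustrative guesses.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the Tool4lm server as a child process (hypothetical entry point).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover the tools the server exposes (search, fetch, docs, calc, ...).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call a tool; the name and argument shape here are assumptions.
  const result = await client.callTool({
    name: "web.search",
    arguments: { query: "model context protocol" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```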

Key Features

  • Multi-Engine Web Search: Find information across sources like SearXNG and DuckDuckGo, with deduplication and snippet extraction.
  • Secure Web Fetch/Read: Safely download web pages with size/time limits and extract readable text for summarization.
  • Document Search/Read: Search and read files (txt, md, html, pdf) within a designated sandbox directory.
  • Academic Search/Get: Look up scholarly articles via arXiv, Crossref, and Wikipedia, and retrieve metadata.
  • Integrated Calculator: Evaluate mathematical expressions with configurable precision via mathjs (see the sketch after this list).
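
As a rough sketch of what high-precision evaluation with mathjs can look like: the BigNumber configuration and the precision value below are assumptions, not Tool4lm's documented settings.

```typescript
// Sketch of high-precision expression evaluation with mathjs.
// The BigNumber mode and 64-digit precision are illustrative choices.
import { create, all } from "mathjs";

const math = create(all, { number: "BigNumber", precision: 64 });

console.log(math.evaluate("0.1 + 0.2").toString()); // "0.3" (no float rounding error)
console.log(math.evaluate("2^64").toString());      // "18446744073709551616"
console.log(math.evaluate("20!").toString());       // "2432902008176640000"
```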

Use Cases

  • Augmenting LLMs with real-time web search and content summarization capabilities for enhanced information retrieval.
  • Enabling LLMs to search and summarize internal documents or personal knowledge bases within a defined sandbox.
  • Providing LLMs with tools for academic research lookups and precise mathematical calculations for diverse tasks.