Ollama Deep Researcher
Created by Cam10001110101
Performs in-depth research on topics using local LLMs via Ollama, accessible as MCP tools.
About
The Ollama Deep Researcher is a Model Context Protocol (MCP) server adaptation of the LangChain Ollama Deep Researcher. It exposes deep research capabilities as MCP tools, allowing AI assistants in the MCP ecosystem to perform in-depth research on topics using local LLMs via Ollama. Given a topic, the server generates web search queries, gathers and summarizes the results, identifies knowledge gaps, iteratively refines the summary over multiple research cycles, and produces a final markdown summary with all sources used. Research results are stored as MCP resources for persistent access and easy reuse.
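To make that cycle concrete, here is a minimal Python sketch of the loop described above. It is an illustration only: the function names (generate_query, web_search, summarize_sources), the stubbed bodies, and the max_loops default are assumptions for this sketch, not the project's actual API.

```python
# Illustrative sketch of the iterative research cycle; all names and stubs
# below are assumptions, not the server's real implementation.
from dataclasses import dataclass, field


@dataclass
class ResearchState:
    topic: str
    summary: str = ""
    sources: list[str] = field(default_factory=list)


def generate_query(state: ResearchState) -> str:
    """Ask the local LLM for the next web search query (stub)."""
    return f"{state.topic} overview" if not state.summary else f"{state.topic} open questions"


def web_search(query: str) -> list[dict]:
    """Call a search API such as Tavily or Perplexity and return snippets (stub)."""
    return [{"url": "https://example.com", "content": "..."}]


def summarize_sources(state: ResearchState, results: list[dict]) -> None:
    """Fold new results into the running summary and record their sources (stub)."""
    state.sources.extend(r["url"] for r in results)
    state.summary += " " + " ".join(r["content"] for r in results)


def research(topic: str, max_loops: int = 3) -> str:
    """Run the generate -> search -> summarize -> reflect cycle, then emit markdown."""
    state = ResearchState(topic=topic)
    for _ in range(max_loops):
        query = generate_query(state)       # 1. generate a web search query
        results = web_search(query)         # 2. gather search results
        summarize_sources(state, results)   # 3. update the running summary
        # 4. a reflection step would identify knowledge gaps and seed the next loop
    sources_md = "\n".join(f"- {url}" for url in state.sources)
    return f"## Summary\n{state.summary}\n\n## Sources\n{sources_md}"
```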
Key Features
- Integration with Ollama for local LLM usage
- Comprehensive tracing and monitoring with LangSmith
- Iterative research process with multiple cycles
- Supports Tavily and Perplexity APIs for web search
- Research results stored as MCP resources for persistent access (see the client sketch after this list)
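Because the server speaks standard MCP, any MCP client can call its tools. The sketch below uses the official mcp Python SDK; the launch command and the tool name "research" are assumptions made for illustration and should be taken from the project's own documentation, not from this example.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command; replace with the command from the project's README.
server_params = StdioServerParameters(
    command="uv",
    args=["run", "mcp-server-ollama-deep-researcher"],
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server exposes.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Start a research run on a topic; "research" is an assumed tool name.
            result = await session.call_tool("research", arguments={"topic": "vector databases"})
            print(result.content)


asyncio.run(main())
```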
Use Cases
- Automated in-depth research for AI assistants
- Knowledge gap analysis and iterative information gathering
- Generating comprehensive summaries of research topics with cited sources