
MCP Server: Ollama Deep Researcher

Source

Catalog only · STDIO

Overview

MCP Server: Ollama Deep Researcher is a server adaptation of LangChain's Ollama Deep Researcher that enables AI assistants to conduct in-depth research on a topic using local large language models (LLMs) via the Model Context Protocol (MCP).

To use the server, install the necessary prerequisites, clone the repository, install dependencies, and configure your MCP client to connect to the server. You can then issue research commands to explore topics.
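The setup steps above might look like the following in practice. This is a hedged sketch only: the repository URL and the install command are assumptions inferred from the package name, not taken from the project's own instructions.

```shell
# Hypothetical setup; verify the actual repository URL and tooling in the
# project's README before running.
git clone https://github.com/Cam10001110101/mcp-server-ollama-deep-researcher.git
cd mcp-server-ollama-deep-researcher
npm install   # or the project's documented install step (e.g. uv sync)
```

After installing, point your MCP client at the server using one of the config snippets below.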

  • Generates web search queries based on user-provided topics.
  • Gathers and summarizes web search results using APIs like Tavily and Perplexity.
  • Iteratively improves summaries by identifying knowledge gaps and generating new search queries.
  • Outputs a final markdown summary with citations from all sources used during research.
  1. Conducting academic research on specific topics.
  2. Gathering information for content creation and writing.
  3. Assisting in data analysis and knowledge discovery.
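The features above amount to an iterate-and-refine loop: generate a query, search, summarize, find a gap, and repeat until no gap remains. A minimal Python sketch of that loop, with the LLM and search calls stubbed out (every function name here is illustrative, not the server's actual API):

```python
# Illustrative stubs; the real server delegates these steps to a local
# Ollama model and a search API such as Tavily or Perplexity.
def generate_query(topic):
    return f"overview of {topic}"

def web_search(query):
    return [{"url": f"https://example.com/?q={query}", "text": f"results for {query}"}]

def summarize(previous, results):
    return (previous + " " + " ".join(r["text"] for r in results)).strip()

def find_knowledge_gap(summary):
    return None  # stub: pretend the summary is complete after one pass

def research(topic, max_loops=3):
    """Iteratively research a topic: query -> search -> summarize -> find gap -> repeat."""
    summary, sources = "", []
    query = generate_query(topic)
    for _ in range(max_loops):
        results = web_search(query)
        sources.extend(r["url"] for r in results)
        summary = summarize(summary, results)
        gap = find_knowledge_gap(summary)
        if not gap:
            break
        query = gap  # the identified gap becomes the next search query
    citations = "\n".join(f"- {u}" for u in sources)
    return f"{summary}\n\n## Sources\n{citations}"
```

The final string is the markdown summary with a citations section, mirroring the server's described output.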

Add to your AI client

Use these steps to connect MCP Server: Ollama Deep Researcher in Cursor, Claude Desktop, Claude Code, VS Code, Windsurf, Cline, and other MCP-compatible apps. The JSON is the same in each case; only the config file location and top-level key vary by client.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}
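A malformed config file is the most common reason a client fails to pick up the server after a restart. Before restarting, it can be worth checking that the edited file still parses as JSON and contains the server entry; a small helper like the following (not part of any client's tooling, just a sanity check) does the job:

```python
import json

def check_mcp_config(text, server_name):
    """Return True if `text` parses as JSON and lists `server_name` under mcpServers."""
    config = json.loads(text)  # raises ValueError on malformed JSON
    return server_name in config.get("mcpServers", {})
```

Run it over your config file, e.g. `check_mcp_config(open(path).read(), "mcp-server-ollama-deep-researcher-cam10001110101")`, substituting the path for your platform.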

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-server-ollama-deep-researcher-cam10001110101": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-server-ollama-deep-researcher-cam10001110101"
      ]
    }
  }
}