Deep-research
An MCP deep-research server that uses Gemini to create a research AI agent
Overview
Open Deep Research is an AI-powered research assistant that performs iterative, deep research on any topic by combining search engines, web scraping, and Gemini large language models. It is designed to refine research direction over time and provide comprehensive insights.
To use Open Deep Research, you can either integrate it as a Model Context Protocol (MCP) tool for AI agents or run it standalone via the command line interface (CLI). You need to set up the environment variables and install the necessary dependencies before starting the server.
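As a sketch, a local setup might look like the following; the repository URL, install commands, and environment-variable names (`GEMINI_API_KEY`, `FIRECRAWL_KEY`) are assumptions, so check the project's README and `.env.example` for the actual values.

```shell
# Hypothetical setup sketch -- repository URL and variable names are assumptions
git clone https://github.com/ssdeanx/deep-research-mcp-server.git
cd deep-research-mcp-server
npm install

# API keys for Gemini and Firecrawl (names illustrative; see .env.example)
export GEMINI_API_KEY="your-gemini-api-key"
export FIRECRAWL_KEY="your-firecrawl-api-key"
```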
Features
- MCP integration for seamless AI agent usage
- Iterative research capabilities
- Intelligent query generation using Gemini LLMs
- Configurable depth and breadth parameters for research
- Smart follow-up question generation
- Comprehensive markdown report generation
- Concurrent processing of multiple searches
Use Cases
- Conducting in-depth research on emerging technologies
- Generating detailed reports for academic papers
- Assisting AI agents in gathering information on specific topics
Add to your AI client
Use these steps to connect Deep-research in Cursor, Claude, VS Code, and other MCP-compatible apps.
Cursor
Add this to your .cursor/mcp.json file in your project root, then restart Cursor.
.cursor/mcp.json
{
  "mcpServers": {
    "deep-research-mcp-server-ssdeanx": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
Claude Desktop
Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.
~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
{
  "mcpServers": {
    "deep-research-mcp-server-ssdeanx": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
Claude Code
Add this to your project's .mcp.json file. Claude Code will detect it automatically.
.mcp.json (project root)
{
  "mcpServers": {
    "deep-research-mcp-server-ssdeanx": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
VS Code (Copilot)
Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.
.vscode/mcp.json
{
  "servers": {
    "deep-research-mcp-server-ssdeanx": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
Windsurf
Add this to your Windsurf MCP config file, then restart Windsurf.
~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "deep-research-mcp-server-ssdeanx": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
Cline
Open Cline settings, navigate to MCP Servers, and add this server configuration.
Cline MCP Settings (via UI)
{
  "mcpServers": {
    "deep-research-mcp-server-ssdeanx": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-deep-research-mcp-server-ssdeanx"
      ]
    }
  }
}
FAQ
What is the purpose of Open Deep Research?
It aims to provide a simple implementation of a deep research agent that can refine its research direction over time.
What are the requirements to run Open Deep Research?
You need a Node.js environment and API keys for Firecrawl and Gemini.
Can I use Open Deep Research without MCP?
Yes, it can be used standalone via the CLI.
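A standalone CLI run might look like the following sketch; the script name and the interactive prompt flow are assumptions based on typical Node.js projects, not confirmed by the source, so check `package.json` for the actual scripts.

```shell
# Start the interactive research CLI
# (script name "start" is an assumption; check package.json)
npm run start

# The agent would then prompt for a research topic and the breadth/depth
# parameters described above, and emit a markdown report when finished.
```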