
Higress AI-Search MCP Server

Source

An MCP server that enhances AI responses with real-time search results via Higress ai-search.

Catalog only · STDIO

Overview

Higress AI-Search MCP Server is a Model Context Protocol server that enhances AI responses by integrating real-time search results from various search engines through the Higress ai-search feature.

To use the server, configure it with the required environment variables and run it with one of the provided commands: the `uvx` command for automatic package installation, or the `uv` command for local development.
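As a sketch, a typical STDIO entry using `uvx` might look like the following. The environment variable names (`HIGRESS_URL`, `MODEL`, `INTERNAL_KNOWLEDGE_BASES`) and their values are assumptions for illustration; check the project documentation for the names and defaults your deployment expects.

```json
{
  "mcpServers": {
    "higress-ai-search": {
      "command": "uvx",
      "args": ["higress-ai-search-mcp-server"],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "employee_handbook,company_policies"
      }
    }
  }
}
```

Here `HIGRESS_URL` would point at the Higress gateway's chat-completions endpoint, and `MODEL` names the LLM routed through the ai-proxy plugin.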

  • Internet Search: access general web information from Google, Bing, and Quark.
  • Academic Search: retrieve scientific papers and research from Arxiv.
  • Internal Knowledge Search: search through internal knowledge bases.
  1. Enhancing AI model responses with up-to-date information.
  2. Providing academic research support for AI applications.
  3. Enabling internal knowledge retrieval for organizations.

Add to your AI client

Use these steps to connect Higress AI-Search MCP Server in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "higress-ai-search-mcp-server-cr7258": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "higress-ai-search-mcp-server-cr7258": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "higress-ai-search-mcp-server-cr7258": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "higress-ai-search-mcp-server-cr7258": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "higress-ai-search-mcp-server-cr7258": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "higress-ai-search-mcp-server-cr7258": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-higress-ai-search-mcp-server-cr7258"
      ]
    }
  }
}

FAQ

Can I use this server for any AI model?

Yes! You can configure it to work with various LLM models to suit your requirements.

Is there a demo available?

Yes! You can find demo links in the project documentation.

What are the prerequisites for running this server?

You need to install the `uv` package manager and configure your Higress service with the ai-search and ai-proxy plugins.
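For local development with `uv` (rather than `uvx`), an entry along these lines is one possible sketch. The repository path is a placeholder, and the environment variable names are assumptions as above:

```json
{
  "mcpServers": {
    "higress-ai-search": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/higress-ai-search-mcp-server",
        "run",
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo"
      }
    }
  }
}
```

The `--directory` flag tells `uv` to run the server from a local checkout, which is convenient when iterating on the server code itself.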
