
MCP Server for Vertex AI Search

Source

An MCP server for Vertex AI Search

Catalog only · STDIO

Overview

MCP Server for Vertex AI Search is a server solution that enables users to search documents using Vertex AI, leveraging Gemini's capabilities to ground responses in private data stored in Vertex AI Datastore.

To use the MCP server, set up your local environment by installing the prerequisites, configure the server via a YAML file, and run the server with the appropriate transport settings (STDIO or SSE). You can also test the search functionality directly, without running the server.

Key features:

  • Integration with Vertex AI for document search
  • Grounding of search results in private data
  • Support for multiple Vertex AI data stores
  • Configurable server settings via YAML

Typical use cases:

  1. Searching through private documents using AI capabilities.
  2. Enhancing search results by grounding them in specific datasets.
  3. Integrating multiple data stores for comprehensive search functionality.
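As a rough illustration of the YAML-based configuration mentioned above, the sketch below shows how a server might reference a Gemini model and one or more Vertex AI data stores. The field names (`server`, `model`, `data_stores`, etc.) are illustrative assumptions, not the project's actual schema; consult the example config shipped with the repository for the real structure.

```yaml
# Hypothetical config.yml — field names are assumptions, see the
# project's own example config for the authoritative schema.
server:
  name: mcp-vertexai-search

model:
  project_id: my-gcp-project        # your Google Cloud project ID
  location: us-central1             # region hosting the model
  model_name: gemini-1.5-flash      # Gemini model used for grounding

data_stores:
  - project_id: my-gcp-project
    location: global
    datastore_id: my-private-docs   # Vertex AI Search data store ID
    tool_name: search_private_docs  # name exposed as an MCP tool
    description: Searches internal documentation.
  - project_id: my-gcp-project
    location: global
    datastore_id: my-wiki
    tool_name: search_wiki
    description: Searches the company wiki.
```

Listing several entries under `data_stores` is how multiple stores would be exposed, each as its own search tool.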

Add to your AI client

Use these steps to connect MCP Server for Vertex AI Search in Cursor, Claude, VS Code, and other MCP-compatible apps.

Cursor

Add this to your .cursor/mcp.json file in your project root, then restart Cursor.

.cursor/mcp.json

{
  "mcpServers": {
    "mcp-vertexai-search-ubie-oss": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

Claude Desktop

Add this server entry to the mcpServers object in your Claude Desktop config, then restart the app.

~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)

{
  "mcpServers": {
    "mcp-vertexai-search-ubie-oss": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

Claude Code

Add this to your project's .mcp.json file. Claude Code will detect it automatically.

.mcp.json (project root)

{
  "mcpServers": {
    "mcp-vertexai-search-ubie-oss": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

VS Code (Copilot)

Add this to your .vscode/mcp.json file. Requires the GitHub Copilot extension with MCP support enabled.

.vscode/mcp.json

{
  "servers": {
    "mcp-vertexai-search-ubie-oss": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

Windsurf

Add this to your Windsurf MCP config file, then restart Windsurf.

~/.codeium/windsurf/mcp_config.json

{
  "mcpServers": {
    "mcp-vertexai-search-ubie-oss": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

Cline

Open Cline settings, navigate to MCP Servers, and add this server configuration.

Cline MCP Settings (via UI)

{
  "mcpServers": {
    "mcp-vertexai-search-ubie-oss": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-mcp-vertexai-search-ubie-oss"
      ]
    }
  }
}

FAQ

What is grounding in Vertex AI?

Grounding is the process of anchoring a model's responses in specific data from a data store, which improves the factual accuracy and relevance of search results.

How do I set up the local environment?

You can set up the local environment by following the installation instructions provided in the documentation, including creating a virtual environment and installing required packages.
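The setup described above might look like the following shell session. The repository URL comes from the project's source link; the virtual-environment and install steps are a generic sketch, and the final `serve` invocation with its flags is an assumption about the CLI, so check the project's README for the exact commands.

```shell
# Clone the project (URL from the project's source link).
git clone https://github.com/ubie-oss/mcp-vertexai-search
cd mcp-vertexai-search

# Create and activate an isolated Python environment.
python3 -m venv .venv
. .venv/bin/activate

# Install the package and its dependencies into the venv.
pip install -e .

# Hypothetical invocation: run the server against your YAML config.
# The subcommand and flag names are assumptions; consult the README.
# mcp-vertexai-search serve --config config.yml --transport stdio
```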

Can I use multiple data stores?

Yes, the MCP server supports integration with one or multiple Vertex AI data stores.