
Vertex AI Search MCP Server

An MCP server for Vertex AI Search

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "ubie-oss-mcp-vertexai-search": {
      "command": "uv",
      "args": [
        "run",
        "mcp-vertexai-search",
        "serve",
        "--config",
        "config.yml",
        "--transport",
        "stdio"
      ]
    }
  }
}
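If you want to generate the client entry above programmatically (for example, to point at a different config file or transport), it can be built with a short Python sketch. The server key `ubie-oss-mcp-vertexai-search` and the `config.yml` path are just the example values from the snippet above:

```python
import json

def mcp_entry(config_path: str = "config.yml", transport: str = "stdio") -> dict:
    """Build the MCP client configuration entry shown above."""
    return {
        "mcpServers": {
            "ubie-oss-mcp-vertexai-search": {
                "command": "uv",
                "args": [
                    "run", "mcp-vertexai-search", "serve",
                    "--config", config_path,
                    "--transport", transport,
                ],
            }
        }
    }

# Print the JSON block ready to paste into an MCP client config file.
print(json.dumps(mcp_entry(), indent=2))
```

This mirrors the JSON verbatim, so the output can be merged into an existing `mcpServers` map.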

This MCP server performs document search over Vertex AI data stores with Gemini grounding. It integrates your private Vertex AI data stores to produce grounded, relevant results, and it can run locally or be driven by an MCP client over standard I/O or Server-Sent Events (SSE) transport.

How to use

You can use this MCP server in two ways: run it locally as an MCP process and connect with an MCP client, or test the Vertex AI search flow directly without starting the server.

How to install

Before installing, you need uv (the Python package and project manager used to run the server) and access to one or more Vertex AI data stores.
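If uv is not installed yet, the standard installation methods are pip or Astral's standalone installer (these commands are general uv setup, not specific to this project):

```shell
# Install uv with pip ...
pip install uv

# ... or with the standalone installer (macOS/Linux)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Verify the installation
uv --version
```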

Option A: Clone the repository and run locally

# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git

# Create a virtual environment
uv venv

# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search

Option B: Install the Python package directly from the repository

# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help

Additional setup notes

Run the MCP server with a configuration file that fits your needs. The server supports two transports: standard input/output (stdio) and Server-Sent Events (sse). Select the transport when starting the server.
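For example, the stdio invocation below matches the client configuration shown earlier; the sse form is inferred from the two supported transports and uses the same `--transport` flag:

```shell
# stdio transport: for MCP clients that spawn the server as a subprocess
uv run mcp-vertexai-search serve --config config.yml --transport stdio

# SSE transport: for clients that connect to a running server
uv run mcp-vertexai-search serve --config config.yml --transport sse
```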

Appendix A: Config file

The configuration file defines the MCP server name, Vertex AI model parameters, and the Vertex AI data stores you want to query. Use the template as a guide and fill in your specific values.

server:
  name: your_vertexai_mcp_server

model:
  model_name: your_vertexai_model
  project_id: your_gcp_project_id
  location: us-central1
  impersonate_service_account: false
  generate_content_config: {}

data_stores:
  - project_id: your_gcp_project_id
    location: us
    datastore_id: your_datastore_id
    tool_name: your_tool_name
    description: Your Vertex AI datastore description

Available tools

serve

Runs the MCP server with the provided configuration, listening for client connections over the selected transport (stdio or sse).

search

Tests the Vertex AI search flow by querying the configured data stores and returning results without starting the MCP server.
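A quick way to exercise the search flow from the command line might look like the following. The `--config` flag mirrors the serve subcommand; the `--query` flag is an assumption, since the exact options are not documented here, so check `uv run mcp-vertexai-search search --help` for the actual interface:

```shell
# Hypothetical invocation: --query is a guess; verify with --help first
uv run mcp-vertexai-search search --config config.yml --query "your question"
```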