Vertex AI Search MCP server

Integrates with Google's Vertex AI and Discovery Engine APIs to enable advanced search and retrieval operations on large datasets, supporting semantic search and natural language understanding.
Provider: Ubie
Release date: Feb 18, 2025
Language: Python
Stats: 10 stars

This MCP server enables document searching using Vertex AI with Gemini's grounding capabilities. It connects to your Vertex AI datastores, allowing Gemini to deliver high-quality search results grounded in your private data.

Installation

You can set up the MCP server in two ways:

Option 1: Clone the Repository

# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
cd mcp-vertexai-search

# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search

Option 2: Install the Python Package

# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help

Note: When installing via the Python package, you'll need to create a config file yourself, since the config template is not included in the package.
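If you have a local clone of the repository, one way to bootstrap the config is to copy its template and edit it. The template file name below is an assumption, so check the repository for the actual path:

# Copy the config template out of a local clone of the repository
# (the template file name is an assumption; check the repo for the actual one)
cp /path/to/mcp-vertexai-search/config.yml.template ./config.yml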

Usage

Configuring the Server

Create a configuration file based on the template structure below:

server:
  name: "your-server-name"
  
model:
  model_name: "your-model-name"
  project_id: "your-project-id"
  location: "us-central1"
  impersonate_service_account: "your-service-account"
  generate_content_config:
    # Your model configuration

data_stores:
  - project_id: "your-project-id"
    location: "us"
    datastore_id: "your-datastore-id"
    tool_name: "your-tool-name"
    description: "Description of your datastore"

Running the Server

The MCP server supports two transport methods: SSE (Server-Sent Events) and stdio (Standard Input/Output). stdio is typically used when an MCP client such as Cursor launches the server as a subprocess, while SSE exposes the server over HTTP for clients that connect to a running instance.

mcp-vertexai-search serve \
    --config config.yml \
    --transport <stdio|sse>
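If you installed from a clone with uv (Option 1), the same command can be run through uv, for example over stdio:

uv run mcp-vertexai-search serve \
    --config config.yml \
    --transport stdio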

Testing Search Functionality

You can test the search functionality without starting the MCP server:

mcp-vertexai-search search \
    --config config.yml \
    --query "your search query"

Configuration Details

The configuration file contains these key sections:

Server Configuration

  • server.name: The name of your MCP server

Model Configuration

  • model.model_name: Vertex AI model name
  • model.project_id: Google Cloud project ID
  • model.location: Model location (e.g., us-central1)
  • model.impersonate_service_account: Service account to impersonate
  • model.generate_content_config: Configuration for content generation

Data Store Configuration

  • data_stores: List of Vertex AI data stores to connect to
    • project_id: Google Cloud project ID for the data store
    • location: Data store location (e.g., us)
    • datastore_id: ID of the Vertex AI data store
    • tool_name: Name of the MCP tool exposed for this data store
    • description: Description of the data store

Each data store represents a source of documents that can be searched through the MCP server.
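Searches like this are typically served through Gemini's grounding with Vertex AI Search. The following Python snippet is a minimal sketch of that grounding call using the vertexai SDK; it illustrates the underlying mechanism rather than this server's actual implementation, and the project, location, data store ID, and model name are placeholders:

import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholder values; use the same project, location, and datastore as in config.yml
vertexai.init(project="your-project-id", location="us-central1")

datastore = (
    "projects/your-project-id/locations/us/"
    "collections/default_collection/dataStores/your-datastore-id"
)

# Ground Gemini's answers in documents from the Vertex AI Search data store
search_tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=datastore))
)

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content("your search query", tools=[search_tool])
print(response.text)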

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}
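For this server, the entry would look roughly like the following, assuming mcp-vertexai-search is installed on your PATH and config.yml points to your configuration (the server name and paths here are illustrative):

{
    "mcpServers": {
        "vertexai-search": {
            "command": "mcp-vertexai-search",
            "args": [
                "serve",
                "--config",
                "/path/to/config.yml",
                "--transport",
                "stdio"
            ]
        }
    }
}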

Adding an MCP server to a project

To add an MCP server to a project, create a new .cursor/mcp.json file in the project root or add the server to the existing one. The format is exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the MCP server exposes and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
