This MCP server enables document searching using Vertex AI with Gemini's grounding capabilities. It connects to your Vertex AI data stores, allowing Gemini to deliver high-quality search results grounded in your private data.
You can set up the MCP server in two ways:

**Option 1: Install from source**

```shell
# Clone the repository
git clone [email protected]:ubie-oss/mcp-vertexai-search.git
cd mcp-vertexai-search

# Create a virtual environment
uv venv

# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search
```

**Option 2: Install as a Python package**

```shell
# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help
```
Note: When installing via the Python package, you'll need to create a config file yourself based on the template, as it isn't included in the package.
Create a configuration file based on the template structure below:

```yaml
server:
  name: "your-server-name"
model:
  model_name: "your-model-name"
  project_id: "your-project-id"
  location: "us-central1"
  impersonate_service_account: "your-service-account"
  generate_content_config:
    # Your model configuration
data_stores:
  - project_id: "your-project-id"
    location: "us"
    datastore_id: "your-datastore-id"
    tool_name: "your-tool-name"
    description: "Description of your datastore"
```
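The `generate_content_config` block is left as a placeholder in the template. As an illustrative sketch only, a filled-in version might use standard Vertex AI generation parameters such as `temperature`, `top_p`, and `max_output_tokens` (which keys this server actually passes through is an assumption, so check the template shipped with the repository):

```yaml
# Hypothetical example values; tune for your use case
generate_content_config:
  temperature: 0.2        # lower = more deterministic answers
  top_p: 0.95             # nucleus sampling cutoff
  max_output_tokens: 1024 # cap on the length of generated responses
```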
The MCP server supports two transport methods: SSE (Server-Sent Events) and stdio (standard input/output).

```shell
mcp-vertexai-search serve \
  --config config.yml \
  --transport <stdio|sse>
```
You can test the search functionality without starting the MCP server:

```shell
mcp-vertexai-search search \
  --config config.yml \
  --query "your search query"
```
The configuration file contains these key sections:

- `server.name`: The name of your MCP server
- `model.model_name`: Vertex AI model name
- `model.project_id`: Google Cloud project ID
- `model.location`: Model location (e.g., `us-central1`)
- `model.impersonate_service_account`: Service account to impersonate
- `model.generate_content_config`: Configuration for content generation
- `data_stores`: List of Vertex AI data stores to connect to
  - `project_id`: Google Cloud project ID for the data store
  - `location`: Data store location (e.g., `us`)
  - `datastore_id`: ID of the Vertex AI data store
  - `tool_name`: Name of the tool
  - `description`: Description of the data store

Each data store represents a source of documents that can be searched through the MCP server.
There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the `~/.cursor/mcp.json` file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the `.cursor/mcp.json` file.
To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the `~/.cursor/mcp.json` file will be opened and you can add your server like this:

```json
{
  "mcpServers": {
    "cursor-rules-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "cursor-rules-mcp"
      ]
    }
  }
}
```
To add an MCP server to a project, you can create a new `.cursor/mcp.json` file or add it to the existing one. This will look exactly the same as the global MCP server example above.
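Applied to this server, a project-level `.cursor/mcp.json` might look like the sketch below, launching the CLI over the stdio transport. The config path is a hypothetical placeholder; point it at wherever your actual config file lives, and make sure `mcp-vertexai-search` is on Cursor's `PATH` (or use an absolute path to the binary):

```json
{
  "mcpServers": {
    "mcp-vertexai-search": {
      "command": "mcp-vertexai-search",
      "args": [
        "serve",
        "--config",
        "/path/to/config.yml",
        "--transport",
        "stdio"
      ]
    }
  }
}
```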
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then be able to see the available tools the added MCP server has available and will call them when it needs to.
You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.