Vertex AI Gemini MCP server

Provides a bridge to Google Cloud's Vertex AI Gemini models with web search grounding, direct knowledge answering, and documentation-based responses through configurable tools and streaming support.
  • Provider: Shariq Riaz
  • Release date: Apr 21, 2025
  • Language: TypeScript
  • Stats: 5 stars

The Vertex AI MCP Server provides a comprehensive suite of tools for interacting with Google Cloud's Vertex AI Gemini models, focusing on coding assistance and general query answering through the Model Context Protocol (MCP).

Prerequisites

  • Node.js (v18+)
  • Bun (npm install -g bun)
  • Google Cloud Project with Billing enabled
  • Vertex AI API enabled in your GCP project
  • Google Cloud Authentication configured in your environment
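For the last two prerequisites, a minimal setup sketch using the `gcloud` CLI looks like the following; `aiplatform.googleapis.com` is the service name for the Vertex AI API, and `YOUR_GCP_PROJECT_ID` is a placeholder for your own project:

```shell
# Enable the Vertex AI API in your project
gcloud services enable aiplatform.googleapis.com --project=YOUR_GCP_PROJECT_ID

# Configure Application Default Credentials for your local environment
gcloud auth application-default login
```

Any authentication method that provides Application Default Credentials (e.g. a service account key via GOOGLE_APPLICATION_CREDENTIALS) should work equally well.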

Installation

  1. Install Dependencies:

    bun install
    
  2. Configure Environment:

    • Create a .env file in the project root (copy from .env.example)
    • Set the required environment variables (especially GOOGLE_CLOUD_PROJECT)
  3. Build the Server:

    bun run build
    

Running the Server

Standalone with NPX

You can run the server directly using npx:

# Ensure required environment variables are set
npx vertex-ai-mcp-server

Or install it globally:

npm install -g vertex-ai-mcp-server
vertex-ai-mcp-server
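When running standalone, one way to supply the required variables is to prefix the command with them (an illustrative sketch; `YOUR_GCP_PROJECT_ID` is a placeholder):

```shell
# Pass the required project ID (and optional overrides) via the environment
GOOGLE_CLOUD_PROJECT=YOUR_GCP_PROJECT_ID \
GOOGLE_CLOUD_LOCATION=us-central1 \
npx -y vertex-ai-mcp-server
```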

Running with Cline

Add configuration to your Cline MCP settings file (e.g., .roo/mcp.json):

Option A: Using Node (Direct Path)

{
  "mcpServers": {
    "vertex-ai-mcp-server": {
      "command": "node",
      "args": [
        "/full/path/to/your/vertex-ai-mcp-server/build/index.js"
      ],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "YOUR_GCP_PROJECT_ID",
        "GOOGLE_CLOUD_LOCATION": "us-central1",
        "VERTEX_AI_MODEL_ID": "gemini-2.5-pro-exp-03-25",
        "VERTEX_AI_TEMPERATURE": "0.0",
        "VERTEX_AI_USE_STREAMING": "true",
        "VERTEX_AI_MAX_OUTPUT_TOKENS": "65535",
        "VERTEX_AI_MAX_RETRIES": "3",
        "VERTEX_AI_RETRY_DELAY_MS": "1000"
      },
      "disabled": false,
      "timeout": 3600
    }
  }
}

Option B: Using NPX

{
  "mcpServers": {
    "vertex-ai-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "vertex-ai-mcp-server"
      ],
      "env": {
        "GOOGLE_CLOUD_PROJECT": "YOUR_GCP_PROJECT_ID",
        "GOOGLE_CLOUD_LOCATION": "us-central1",
        "VERTEX_AI_MODEL_ID": "gemini-2.5-pro-exp-03-25",
        "VERTEX_AI_TEMPERATURE": "0.0",
        "VERTEX_AI_USE_STREAMING": "true",
        "VERTEX_AI_MAX_OUTPUT_TOKENS": "65535",
        "VERTEX_AI_MAX_RETRIES": "3",
        "VERTEX_AI_RETRY_DELAY_MS": "1000"
      },
      "disabled": false,
      "timeout": 3600
    }
  }
}

Available Tools

Query & Generation Tools

  • answer_query_websearch: Answers queries using Vertex AI with Google Search results
  • answer_query_direct: Answers queries using only Vertex AI's internal knowledge
  • explain_topic_with_docs: Explains software topics using official documentation
  • get_doc_snippets: Provides code snippets from official documentation
  • generate_project_guidelines: Creates structured project guidelines based on technologies

Filesystem Operations

  • read_file_content: Reads file contents
  • read_multiple_files_content: Reads multiple files simultaneously
  • write_file_content: Creates or overwrites files
  • edit_file_content: Makes line-based edits to text files
  • create_directory: Creates directories
  • list_directory_contents: Lists directory contents (non-recursive)
  • get_directory_tree: Gets recursive directory tree as JSON
  • move_file_or_directory: Moves/renames files and directories
  • search_filesystem: Searches for files/directories matching patterns
  • get_filesystem_info: Gets metadata about files or directories

Combined AI + Filesystem Operations

  • save_generate_project_guidelines: Generates and saves project guidelines
  • save_doc_snippet: Finds and saves code snippets from documentation
  • save_topic_explanation: Generates and saves topic explanations
  • save_answer_query_direct: Answers queries and saves results
  • save_answer_query_websearch: Answers web-enhanced queries and saves results
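To try these tools interactively without a full client, you can point the MCP Inspector (the protocol's standard debugging UI) at the built server. This is a sketch assuming you built the server locally and that `build/index.js` is the entry point, as in the Cline configuration above:

```shell
# Launch the MCP Inspector against the built server to browse and call tools
GOOGLE_CLOUD_PROJECT=YOUR_GCP_PROJECT_ID \
npx @modelcontextprotocol/inspector node build/index.js
```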

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to that project's .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

This opens the ~/.cursor/mcp.json file, where you can add your server like this:

{
    "mcpServers": {
        "vertex-ai-mcp-server": {
            "command": "npx",
            "args": [
                "-y",
                "vertex-ai-mcp-server"
            ],
            "env": {
                "GOOGLE_CLOUD_PROJECT": "YOUR_GCP_PROJECT_ID",
                "GOOGLE_CLOUD_LOCATION": "us-central1"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you may need to return to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the MCP server exposes and call them when needed.

You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
