MCP Server Gemini is a powerful server that enables Claude Desktop and other MCP-compatible clients to leverage Google's Gemini AI models. It provides seamless integration with various Gemini models including the latest Gemini 2.5 Pro and Gemini 2.5 Flash, supporting advanced capabilities like thinking models, Google Search grounding, JSON mode, and vision support.
For Global Installation:
npm install -g mcp-server-gemini
For Local Installation (from source):
git clone https://github.com/gurr-i/mcp-server-gemini-pro.git
cd mcp-server-gemini-pro
npm install
npm run build
You can set up your Google AI Studio API key in two ways:
Using Environment Variables:
export GEMINI_API_KEY="your_api_key_here"
Using a .env File:
echo "GEMINI_API_KEY=your_api_key_here" > .env
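Either way, you can confirm the key is actually visible to your shell before starting the server. A minimal sanity check (illustrative only; the server itself reads `GEMINI_API_KEY` from its environment):

```shell
# Report whether GEMINI_API_KEY is visible to this shell.
check_gemini_key() {
  if [ -n "${GEMINI_API_KEY:-}" ]; then
    echo "GEMINI_API_KEY is set (${#GEMINI_API_KEY} characters)"
  else
    echo "GEMINI_API_KEY is NOT set; export it or add it to .env"
  fi
}
check_gemini_key
```

Note that a `.env` file is only read by the server process; it does not export the variable into your interactive shell.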
You'll need to configure Claude Desktop to use the MCP server. Edit the configuration file for your platform:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
For Global Installation:
{
"mcpServers": {
"gemini": {
"command": "mcp-server-gemini",
"env": {
"GEMINI_API_KEY": "your_api_key_here"
}
}
}
}
For Local Installation:
{
"mcpServers": {
"gemini": {
"command": "node",
"args": ["/path/to/mcp-server-gemini-pro/dist/enhanced-stdio-server.js"],
"env": {
"GEMINI_API_KEY": "your_api_key_here"
}
}
}
}
After making changes, restart Claude Desktop completely for them to take effect.
Once configured, you can use Gemini through Claude Desktop with natural language commands:
"Use Gemini to explain quantum computing in simple terms"
"Generate a creative story about AI using Gemini 2.5 Pro"
"Use Gemini with JSON mode to extract key points from this text"
"Use Gemini with grounding to get the latest news about AI"
"Generate a Python function using Gemini's thinking capabilities"
"Analyze this image with Gemini" (attach image)
"What's in this screenshot using Gemini vision?"
"Use Gemini to review this code and suggest improvements"
"Generate comprehensive tests for this function using Gemini"
The server can be configured using these optional environment variables:
# Logging level (default: info)
# Options: error, warn, info, debug
LOG_LEVEL=info
# Enable performance metrics (default: false)
ENABLE_METRICS=false
# Rate limiting configuration
RATE_LIMIT_ENABLED=true # Enable/disable rate limiting (default: true)
RATE_LIMIT_REQUESTS=100 # Max requests per window (default: 100)
RATE_LIMIT_WINDOW=60000 # Time window in ms (default: 60000 = 1 minute)
# Request timeout in milliseconds (default: 30000 = 30 seconds)
REQUEST_TIMEOUT=30000
# Environment mode (default: production)
NODE_ENV=production
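The rate-limit variables describe a fixed time window: at most RATE_LIMIT_REQUESTS requests are accepted per RATE_LIMIT_WINDOW milliseconds, after which requests are rejected until the window rolls over. As a sketch of that general technique (not the server's actual implementation):

```javascript
// Fixed-window rate limiting as described by the RATE_LIMIT_* variables.
// Illustrative sketch only -- not the server's actual code.
class FixedWindowRateLimiter {
  constructor(maxRequests, windowMs) {
    this.maxRequests = maxRequests; // e.g. RATE_LIMIT_REQUESTS=100
    this.windowMs = windowMs;       // e.g. RATE_LIMIT_WINDOW=60000
    this.windowStart = 0;
    this.count = 0;
  }

  // Returns true if the request is allowed, false if rate-limited.
  allow(now = Date.now()) {
    if (now - this.windowStart >= this.windowMs) {
      this.windowStart = now; // start a new window
      this.count = 0;
    }
    if (this.count >= this.maxRequests) return false;
    this.count += 1;
    return true;
  }
}
```

With the defaults (100 requests per 60,000 ms), the 101st request inside a window is rejected, and the counter resets when the next window begins.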
Development Environment:
# .env for development
GEMINI_API_KEY=your_api_key_here
NODE_ENV=development
LOG_LEVEL=debug
RATE_LIMIT_ENABLED=false
REQUEST_TIMEOUT=60000
Production Environment:
# .env for production
GEMINI_API_KEY=your_api_key_here
NODE_ENV=production
LOG_LEVEL=warn
RATE_LIMIT_ENABLED=true
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW=60000
REQUEST_TIMEOUT=30000
ENABLE_METRICS=true
{
"mcpServers": {
"gemini": {
"command": "mcp-server-gemini",
"env": {
"GEMINI_API_KEY": "your_api_key_here",
"LOG_LEVEL": "info",
"RATE_LIMIT_REQUESTS": "200",
"REQUEST_TIMEOUT": "45000"
}
}
}
}
Tool | Description | Key Features |
---|---|---|
generate_text | Generate text with advanced features | Thinking models, JSON mode, grounding |
analyze_image | Analyze images using vision models | Multi-modal understanding, detailed analysis |
count_tokens | Count tokens for cost estimation | Accurate token counting for all models |
list_models | List all available Gemini models | Real-time model availability and features |
embed_text | Generate text embeddings | High-quality vector representations |
get_help | Get usage help and documentation | Self-documenting with examples |
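Under the hood, MCP clients invoke these tools with a standard JSON-RPC `tools/call` request over stdio. For example, a `generate_text` call might look like the following (the request envelope is the standard MCP shape; the argument names here are illustrative, so use `get_help` or your client's tool listing to confirm the actual schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_text",
    "arguments": {
      "prompt": "Explain quantum computing in simple terms",
      "model": "gemini-2.5-flash"
    }
  }
}
```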
Model | Context Window | Features | Best For | Speed |
---|---|---|---|---|
gemini-2.5-pro | 2M tokens | Thinking, JSON, Grounding | Complex reasoning, coding | Slower |
gemini-2.5-flash ⭐ | 1M tokens | Thinking, JSON, Grounding | General purpose | Fast |
gemini-2.5-flash-lite | 1M tokens | Thinking, JSON | High-throughput tasks | Fastest |
gemini-2.0-flash | 1M tokens | JSON, Grounding | Standard tasks | Fast |
gemini-2.0-flash-lite | 1M tokens | JSON | Simple tasks | Fastest |
gemini-2.0-pro-experimental | 2M tokens | JSON, Grounding | Experimental features | Medium |
gemini-1.5-pro | 2M tokens | JSON | Legacy support | Medium |
gemini-1.5-flash | 1M tokens | JSON | Legacy support | Fast |
# Check if API key is set
echo $GEMINI_API_KEY
# Verify .env file exists and is readable
grep GEMINI_API_KEY .env
# Check file permissions
ls -la .env
chmod 600 .env
# Test API key manually
curl -H "Content-Type: application/json" \
-d '{"contents":[{"parts":[{"text":"Hello"}]}]}' \
-X POST "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent?key=YOUR_API_KEY"
# Verify config file location (macOS)
ls -la ~/Library/Application\ Support/Claude/claude_desktop_config.json
# Validate JSON syntax
jq . claude_desktop_config.json
# Check server installation
which mcp-server-gemini
npm list -g mcp-server-gemini
# Enable debug logging
export LOG_LEVEL=debug
npm start
To add this MCP server to Claude Code, run this command in your terminal:
claude mcp add-json "gemini" '{"command":"mcp-server-gemini","env":{"GEMINI_API_KEY":"your_api_key_here"}}'
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects. If you only need the server in a single project, you can instead add it to that project's .cursor/mcp.json file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server". This opens the ~/.cursor/mcp.json file, where you can add your server like this:
{
"mcpServers": {
"gemini": {
"command": "mcp-server-gemini",
"env": {
"GEMINI_API_KEY": "your_api_key_here"
}
}
}
}
To add an MCP server to a single project, create a new .cursor/mcp.json file in that project or add the entry to the existing one. The format is exactly the same as the global example above.
Once the server is installed, you may need to go back to Settings > MCP and click the refresh button.
The Cursor agent will then see the tools the MCP server exposes and call them when needed.
You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
To add this MCP server to Claude Desktop:
1. Find the configuration file for your platform:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
2. Add this to your configuration file:
{
"mcpServers": {
"gemini": {
"command": "mcp-server-gemini",
"env": {
"GEMINI_API_KEY": "your_api_key_here"
}
}
}
}
3. Restart Claude Desktop for the changes to take effect