
Ultra MCP Server

100x your Claude Code, Gemini CLI, Cursor, or any other coding tool with MCP client support

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "realmikechong-ultra-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "ultra-mcp@latest"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "GOOGLE_API_KEY": "AIzaSy...",
        "GOOGLE_BASE_URL": "https://Gemini.example.com",
        "AZURE_API_KEY": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        "AZURE_BASE_URL": "https://YOUR_AZURE_RESOURCE.openai.azure.com",
        "XAI_API_KEY": "YOUR_XAI_KEY",
        "XAI_BASE_URL": "https://grok.example.com"
      }
    }
  }
}

Ultra MCP is a unified MCP server that exposes multiple AI models through a single interface, enabling you to orchestrate providers like OpenAI, Google Gemini, Azure OpenAI, and xAI Grok from Claude Code or Cursor with a simple, zero-friction setup.

How to use

You use Ultra MCP by running it as an MCP server and connecting your MCP clients (such as Claude Code or Cursor) to it. Start the server, configure your API keys for the providers you want to use, and then interact with the available tools and prompts through your MCP client. You can manage models, run interactive chats, and access prompts that are discoverable in Claude Code once configured.

How to install

Prerequisites: you need Node.js and npm (or bun) installed on your system.

# Install globally via npm
npm install -g ultra-mcp

# Or run directly with npx
npx -y ultra-mcp config

Configuration

Configure your API keys interactively to enable the various providers. You will be guided through selecting a provider, entering base URLs, and choosing preferred models. The configuration is stored securely on your system and is loaded automatically when the server starts.

npx -y ultra-mcp config

Server execution flow

After you have configured your providers, start Ultra MCP to run the MCP server. You can run it directly via npx or build locally and run the built binary. The server then exposes a single MCP interface to all configured models.

# Run the MCP server directly
npx -y ultra-mcp

# Or build and run the dist output
bun run build
node dist/cli.js

Integration with Claude Code and Cursor

To integrate with Claude Code, register Ultra MCP as an MCP server in your Claude Code settings. Once registered, Claude Code can discover Ultra MCP's prompts and use the API keys you configured.
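As a sketch of that registration, Claude Code's `claude mcp add` command can wire up the server; the server name `ultra` below is an arbitrary choice, not a required identifier:

```shell
# Register Ultra MCP with Claude Code (the name "ultra" is arbitrary)
claude mcp add ultra -- npx -y ultra-mcp
```

After this, restarting Claude Code should list Ultra MCP's tools and prompts alongside the built-ins.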

To integrate with Cursor, add Ultra MCP as an MCP server in your Cursor settings. Ultra MCP will use the API keys you configured during the initial setup.
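As a minimal sketch, the same server entry shown in the Configuration section above can go in Cursor's MCP settings file (commonly `.cursor/mcp.json` in a project, or `~/.cursor/mcp.json` globally; the exact path may vary by Cursor version):

```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY"
      }
    }
  }
}
```

The `env` block is optional if you already ran the interactive `config` command, since stored configuration takes precedence over environment variables.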

CLI commands overview

Ultra MCP provides a set of commands to manage configuration, run a dashboard, perform health checks, and interact with the MCP server from the command line.

# Interactive configuration
npx -y ultra-mcp config

# Web dashboard
npx -y ultra-mcp dashboard

# Health checks
npx -y ultra-mcp doctor

# Interactive chat with models
npx -y ultra-mcp chat

Additional sections

Security and privacy: Ultra MCP stores configuration locally and emphasizes privacy by keeping data on your machine. Environment variables may be used to set API keys and base URLs for each provider, with the configuration file taking precedence over environment variables when both are present.

Environment variables you may configure include keys and base URLs for OpenAI, Google Gemini, Azure OpenAI, and xAI Grok. Use placeholders like YOUR_OPENAI_KEY or YOUR_BASE_URL where you do not have exact values yet.
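For example, the provider variables can be exported in your shell before starting the server; the placeholder values below are assumptions to be replaced with your own keys and endpoints:

```shell
# Placeholder values -- substitute your real keys and endpoints
export OPENAI_API_KEY="YOUR_OPENAI_KEY"
export OPENAI_BASE_URL="https://api.openai.com/v1"
export GOOGLE_API_KEY="YOUR_GOOGLE_KEY"
export AZURE_API_KEY="YOUR_AZURE_KEY"
export AZURE_BASE_URL="https://YOUR_AZURE_RESOURCE.openai.azure.com"
export XAI_API_KEY="YOUR_XAI_KEY"

# Then start the server; it picks up the variables at launch
npx -y ultra-mcp
```

Remember that values saved via `ultra-mcp config` override these environment variables when both are present.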

Vector embeddings: Ultra MCP supports semantic code search with configurable embedding models per provider. You can set embeddingModel to a cost-efficient or a high-accuracy model depending on your needs.
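The exact configuration schema is not documented here; as a hypothetical sketch (the surrounding key names are assumptions, only embeddingModel is taken from the text above), a per-provider entry might look like:

```json
{
  "openai": {
    "apiKey": "YOUR_OPENAI_KEY",
    "embeddingModel": "text-embedding-3-small"
  }
}
```

Here `text-embedding-3-small` stands in for a cost-efficient choice; a higher-accuracy model such as `text-embedding-3-large` could be substituted where search quality matters more than cost.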

Troubleshooting tips: run doctor to verify connections to providers, use the dashboard for real-time statistics and status, and check the generated logs and local Drizzle ORM-backed database for usage data.

Available tools

deep-reasoning

Leverage advanced AI models for complex problem-solving and analysis, enabling deep reasoning and multi-step planning.

investigate

Configurable depth-based investigation with optional Google Search integration to gather insights.

research

Conduct comprehensive research with multiple output formats (summary, detailed, academic).

list-ai-models

View all available AI models and their configuration status.