Multi-LLM Cross-Check MCP server

Provides a unified interface for querying multiple LLM providers simultaneously, enabling side-by-side response comparison for fact-checking, gathering diverse perspectives, or evaluating different models' capabilities.
  • Provider: Lior
  • Release date: Apr 15, 2025
  • Language: Python
  • Stats: 10 stars

This MCP server allows you to cross-check responses from multiple LLM providers simultaneously, providing a unified interface for querying different AI models through Claude Desktop. It supports OpenAI, Anthropic, Perplexity AI, and Google Gemini, processing requests in parallel for faster responses.
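The parallel fan-out described above can be sketched with `asyncio`. This is a minimal illustration, not the server's actual code: the provider functions below are hypothetical stubs standing in for the real OpenAI, Anthropic, Perplexity, and Gemini API calls.

```python
import asyncio

# Hypothetical stubs standing in for real provider API calls.
async def ask_chatgpt(prompt: str) -> str:
    await asyncio.sleep(0)  # placeholder for a network round-trip
    return f"ChatGPT: {prompt}"

async def ask_claude(prompt: str) -> str:
    await asyncio.sleep(0)
    return f"Claude: {prompt}"

async def cross_check(prompt: str) -> dict:
    # Fan the prompt out to every provider concurrently instead of
    # sequentially, so total latency is bounded by the slowest provider.
    providers = {"ChatGPT": ask_chatgpt, "Claude": ask_claude}
    results = await asyncio.gather(*(fn(prompt) for fn in providers.values()))
    return dict(zip(providers.keys(), results))

responses = asyncio.run(cross_check("Is the sky blue?"))
```

Because the requests run concurrently under `asyncio.gather`, adding more providers adds little to the overall response time.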

Installation

Automatic Installation via Smithery

The easiest way to install the Multi LLM Cross-Check Server for Claude Desktop is via Smithery:

npx -y @smithery/cli install @lior-ps/multi-llm-cross-check-mcp-server --client claude

Manual Installation

If you prefer to install manually, follow these steps:

  1. Clone the repository:
git clone https://github.com/lior-ps/multi-llm-cross-check-mcp-server.git
cd multi-llm-cross-check-mcp-server
  2. Initialize the UV environment and install dependencies:
uv venv
uv pip install -r requirements.txt

Configuration

Create a file named claude_desktop_config.json in your Claude Desktop configuration directory with the following content:

{
  "mcpServers": {
    "multi-llm-cross-check": {
      "command": "uv",
      "args": [
        "--directory",
        "/multi-llm-cross-check-mcp-server",
        "run",
        "main.py"
      ],
      "env": {
        "OPENAI_API_KEY": "your_openai_key",
        "ANTHROPIC_API_KEY": "your_anthropic_key",
        "PERPLEXITY_API_KEY": "your_perplexity_key",
        "GEMINI_API_KEY": "your_gemini_key"
      }
    }
  }
}

Configuration Notes

  • You only need to add API keys for the LLM providers you want to use
  • The server will skip providers without configured API keys
  • You may need to use the full path to the UV executable in the command field
    • Find it by running which uv on macOS/Linux or where uv on Windows
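The key-based provider selection in the notes above might look like the following sketch. The environment-variable names match the config block; the function itself is a hypothetical illustration, not the server's actual implementation.

```python
import os

# Env var per provider, matching the "env" block in the config above.
PROVIDER_KEYS = {
    "ChatGPT": "OPENAI_API_KEY",
    "Claude": "ANTHROPIC_API_KEY",
    "Perplexity": "PERPLEXITY_API_KEY",
    "Gemini": "GEMINI_API_KEY",
}

def enabled_providers(env=os.environ) -> list:
    # A provider is active only when its key is present and non-empty;
    # everything else is silently skipped.
    return [name for name, var in PROVIDER_KEYS.items() if env.get(var)]

active = enabled_providers({"OPENAI_API_KEY": "sk-test"})
```

With only `OPENAI_API_KEY` set, `active` contains just `"ChatGPT"`; an empty environment yields an empty list rather than an error.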

API Keys

Obtain API keys from each provider's developer console: OpenAI, Anthropic, Perplexity, and Google AI Studio (for Gemini).

Usage

After configuration, the server works as follows:

  1. The MCP server automatically starts when you open Claude Desktop
  2. In your conversations, you can use the cross_check tool by asking to "cross check with other LLMs"
  3. Provide your prompt, and the server will return responses from all configured LLM providers

Response Format

The server returns a dictionary containing responses from each configured LLM provider:

{
    "ChatGPT": { ... },
    "Claude": { ... },
    "Perplexity": { ... },
    "Gemini": { ... }
}
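One way to consume that dictionary is a side-by-side comparison. The per-provider entry shape below (`{"text": ...}`) is an assumption for illustration; the point is simply iterating the keyed responses.

```python
# Hypothetical response dict shaped like the format above; the inner
# {"text": ...} structure is an assumption for this sketch.
responses = {
    "ChatGPT": {"text": "Yes."},
    "Claude": {"text": "Yes, due to Rayleigh scattering."},
    "Perplexity": {"text": "Yes."},
    "Gemini": {"text": "Yes."},
}

# Collect each provider's answer for a side-by-side view.
summary = {name: reply["text"] for name, reply in responses.items()}

# A crude agreement check: do all providers give the same answer verbatim?
agreement = len({t.lower().rstrip(".") for t in summary.values()}) == 1
```

Here `agreement` is `False` because Claude's longer answer differs verbatim, which is exactly the kind of divergence cross-checking is meant to surface.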

Error Handling

The server includes robust error handling:

  • Providers without API keys are automatically skipped
  • API errors are caught and included in the response
  • Each LLM's response is processed independently, so errors with one provider won't affect others
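The isolation property in the last bullet can be sketched by wrapping each provider call in its own try/except before gathering. The `flaky`/`steady` stubs are hypothetical; the real server calls actual provider SDKs.

```python
import asyncio

# Hypothetical stubs: one provider fails, one succeeds.
async def flaky(prompt: str) -> str:
    raise RuntimeError("rate limited")

async def steady(prompt: str) -> str:
    return "ok: " + prompt

async def safe_call(name, fn, prompt):
    # Catch the error inside this provider's task and report it in that
    # provider's slot, so other providers' responses are unaffected.
    try:
        return name, await fn(prompt)
    except Exception as exc:
        return name, f"Error: {exc}"

async def cross_check(prompt: str) -> dict:
    providers = {"Perplexity": flaky, "Gemini": steady}
    tasks = [safe_call(n, f, prompt) for n, f in providers.items()]
    return dict(await asyncio.gather(*tasks))

responses = asyncio.run(cross_check("hi"))
```

The failing provider's slot holds an error message while the healthy provider's answer comes through untouched.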

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "multi-llm-cross-check" '{"command":"uv","args":["--directory","/multi-llm-cross-check-mcp-server","run","main.py"],"env":{"OPENAI_API_KEY":"your_openai_key","ANTHROPIC_API_KEY":"your_anthropic_key","PERPLEXITY_API_KEY":"your_perplexity_key","GEMINI_API_KEY":"your_gemini_key"}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "multi-llm-cross-check": {
            "command": "uv",
            "args": [
                "--directory",
                "/multi-llm-cross-check-mcp-server",
                "run",
                "main.py"
            ],
            "env": {
                "OPENAI_API_KEY": "your_openai_key",
                "ANTHROPIC_API_KEY": "your_anthropic_key",
                "PERPLEXITY_API_KEY": "your_perplexity_key",
                "GEMINI_API_KEY": "your_gemini_key"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "multi-llm-cross-check": {
            "command": "uv",
            "args": [
                "--directory",
                "/multi-llm-cross-check-mcp-server",
                "run",
                "main.py"
            ],
            "env": {
                "OPENAI_API_KEY": "your_openai_key",
                "ANTHROPIC_API_KEY": "your_anthropic_key",
                "PERPLEXITY_API_KEY": "your_perplexity_key",
                "GEMINI_API_KEY": "your_gemini_key"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
