
Claude Team MCP Server

Coordinates multiple AI models to complete tasks collaboratively, with task planning, execution, and history tracking.

Installation
Add the following to your MCP client configuration file.

Configuration

```json
{
  "mcpServers": {
    "7836246-claude-team-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "claude-team"
      ],
      "env": {
        "CLAUDE_TEAM_MAIN_KEY": "sk-your-api-key",
        "CLAUDE_TEAM_MAIN_URL": "https://api.openai.com/v1",
        "CLAUDE_TEAM_MAIN_MODEL": "gpt-4o",
        "CLAUDE_TEAM_MODEL1_KEY": "sk-your-model1-key",
        "CLAUDE_TEAM_MODEL1_URL": "https://api.anthropic.com/v1",
        "CLAUDE_TEAM_MODEL1_NAME": "claude-3-sonnet",
        "CLAUDE_TEAM_MAIN_PROVIDER": "openai",
        "CLAUDE_TEAM_MODEL1_PROVIDER": "anthropic"
      }
    }
  }
}
```

You can run Claude Code, Windsurf, or Cursor with multiple models collaborating to tackle tasks. This MCP server lets you configure a main model plus several worker models, intelligently distributing work and keeping a complete collaboration history for easy review and improvement.

How to use

Interact with the MCP server through your client by sending natural language tasks. The system analyzes the request and delegates parts of the work to specialized models (for example a fast model for quick formatting, a balanced model for typical development tasks, or a powerful model for deep reasoning). You can ask the team to optimize code, design UI, review changes, or trace performance, and the team will coordinate the effort and return a cohesive result.
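The server's actual delegation logic is not documented here, but routing of this kind can be pictured as a keyword-based tier selector. The tier names and keyword lists below are illustrative assumptions, not the real implementation:

```typescript
// Hypothetical sketch of keyword-based task routing. The tier names and
// keyword patterns are illustrative assumptions, not the server's real code.
type Tier = "fast" | "balanced" | "powerful";

function pickTier(task: string): Tier {
  const t = task.toLowerCase();
  // Deep-reasoning work goes to the strongest model.
  if (/\b(architect|design|optimi[sz]e|trace|debug|review)\b/.test(t)) return "powerful";
  // Quick mechanical edits go to the fastest model.
  if (/\b(format|rename|lint)\b/.test(t)) return "fast";
  // Everything else defaults to the balanced worker.
  return "balanced";
}

console.log(pickTier("format this file"));       // "fast"
console.log(pickTier("optimize the query"));     // "powerful"
console.log(pickTier("implement a login form")); // "balanced"
```

The real server presumably weighs more signals than keywords, but the shape is the same: classify the task, pick a model tier, dispatch the subtask.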

How to install

Prerequisites: ensure you have Node.js and npm installed on your system.

```shell
npm -v
node -v
```

The Claude Team MCP server can be run via npx to avoid a global install, or installed globally for convenience.

Configuration and usage notes

The MCP server supports a main model plus any number of worker models, each configured through environment variables passed to the command that launches the Claude Team MCP. If a worker model lacks a dedicated setting, it inherits the corresponding value from the main configuration.
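The inheritance rule can be pictured with a small resolver: for each setting, a worker-specific variable wins, otherwise the MAIN variable applies. This is a minimal sketch under the assumption that fallback happens per variable; it is not the server's actual code:

```typescript
// Minimal sketch of per-variable fallback: a worker setting such as
// CLAUDE_TEAM_MODEL1_KEY overrides CLAUDE_TEAM_MAIN_KEY; otherwise the
// MAIN value is inherited. Illustrative only, not the server's real code.
type Env = Record<string, string | undefined>;

function resolve(env: Env, worker: string, field: "KEY" | "URL" | "PROVIDER"): string | undefined {
  return env[`CLAUDE_TEAM_${worker}_${field}`] ?? env[`CLAUDE_TEAM_MAIN_${field}`];
}

const env: Env = {
  CLAUDE_TEAM_MAIN_KEY: "sk-main",
  CLAUDE_TEAM_MAIN_URL: "https://api.openai.com/v1",
  CLAUDE_TEAM_MAIN_PROVIDER: "openai",
  CLAUDE_TEAM_MODEL1_KEY: "sk-worker1", // worker overrides only the key
};

console.log(resolve(env, "MODEL1", "KEY")); // "sk-worker1" (its own value)
console.log(resolve(env, "MODEL1", "URL")); // inherited from MAIN
```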

```json
{
  "mcpServers": {
    "claude-team": {
      "command": "npx",
      "args": ["-y", "claude-team"],
      "env": {
        "CLAUDE_TEAM_MAIN_KEY": "sk-your-api-key",
        "CLAUDE_TEAM_MAIN_URL": "https://api.openai.com/v1",
        "CLAUDE_TEAM_MAIN_MODEL": "gpt-4o",
        "CLAUDE_TEAM_MAIN_PROVIDER": "openai",
        
        "CLAUDE_TEAM_MODEL1_NAME": "gpt-3.5-turbo"
      }
    }
  }
}
```

This example shows a basic dual-model setup with a main model and one worker model. You can add further workers as MODEL2, MODEL3, and so on, each with its own KEY, URL, NAME, and PROVIDER.
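For instance, a three-model setup might look like the following; the keys and model names are placeholders, and MODEL2 defines only a NAME, so it would inherit the main key, URL, and provider:

```json
{
  "mcpServers": {
    "claude-team": {
      "command": "npx",
      "args": ["-y", "claude-team"],
      "env": {
        "CLAUDE_TEAM_MAIN_KEY": "sk-your-api-key",
        "CLAUDE_TEAM_MAIN_URL": "https://api.openai.com/v1",
        "CLAUDE_TEAM_MAIN_MODEL": "gpt-4o",
        "CLAUDE_TEAM_MAIN_PROVIDER": "openai",

        "CLAUDE_TEAM_MODEL1_KEY": "sk-your-model1-key",
        "CLAUDE_TEAM_MODEL1_URL": "https://api.anthropic.com/v1",
        "CLAUDE_TEAM_MODEL1_NAME": "claude-3-sonnet",
        "CLAUDE_TEAM_MODEL1_PROVIDER": "anthropic",

        "CLAUDE_TEAM_MODEL2_NAME": "gpt-3.5-turbo"
      }
    }
  }
}
```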

Starting and using the server

After configuring the MCP, start Claude Code (or your MCP client) and begin a session. You can issue tasks like asking the team to implement a user login flow or optimize a code snippet. The system will distribute work across the configured models and present a unified result.

Available tools

team_work

Coordinate team-based task execution by automatically creating experts and distributing work across models.

ask_expert

Consult a specific expert role (frontend/backend/qa) to handle a subtask.

code_review

Review and critique code across the collaboration.

fix_bug

Identify and fix bugs found during collaboration.

history_list

List collaboration history for review.

history_get

Get detailed history of a specific collaboration.

history_search

Search through collaboration history for past tasks.

history_context

Fetch recent context to inform ongoing work.
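Like any MCP tools, these are invoked through the protocol's standard JSON-RPC `tools/call` method, which your client issues for you when you phrase a request in natural language. The argument names below (`query`, `limit`) are assumptions for illustration, not documented parameters of this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "history_search",
    "arguments": { "query": "login flow", "limit": 5 }
  }
}
```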

Claude Team MCP Server - 7836246/claude-team-mcp