
ComfyUI MCP Server

A lightweight Python-based MCP server that interfaces with a local ComfyUI instance to generate images via AI agent requests.

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "comfyui_mcp": {
            "command": "python",
            "args": [
                "server.py"
            ]
        }
    }
}

This lightweight MCP server talks to a local ComfyUI instance to generate images in response to AI agent requests. It speaks MCP over WebSocket and returns image URLs served by ComfyUI, enabling automated workflows and dynamic parameter control.

How to use

Start the MCP server to listen for agent requests; you will typically run it alongside your local ComfyUI instance. Agents request images by providing a prompt, size, and model, and the server routes the request to ComfyUI and returns an accessible image URL.
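As a sketch, an agent request for this server might be assembled like the following. The field names (prompt, width, height, model, workflow_id) and the message shape are assumptions based on the parameters this document mentions; check server.py for the schema it actually parses.

```python
import json

def build_image_request(prompt, width=512, height=512,
                        model=None, workflow_id="basic_api_test"):
    """Assemble a generate-image request for the MCP server.

    Field names here are assumptions drawn from the parameters this
    document mentions (prompt, size, model, workflow); consult
    server.py for the exact schema it expects.
    """
    params = {"prompt": prompt, "width": width, "height": height,
              "workflow_id": workflow_id}
    if model is not None:
        params["model"] = model
    # The message is sent as JSON text over the WebSocket connection.
    return json.dumps({"tool": "generate_image", "params": params})
```

The returned string is what a client would send over the server's WebSocket endpoint.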

How to install

Prerequisites:

- Python 3.10 or newer

- ComfyUI installed and running locally (default port 8188)

- Dependencies: requests, websockets, mcp

How to install (commands)

pip install requests websockets mcp

# Start ComfyUI in a separate terminal window
# Ensure ComfyUI is running on port 8188
# Then start the MCP server in another terminal
python server.py

# The MCP server listens for connections on ws://localhost:9000 by default
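Before starting the MCP server, it can help to confirm that ComfyUI is actually reachable. A minimal stdlib-only check, assuming ComfyUI's default port and its /system_stats status endpoint:

```python
from urllib import request, error

def comfyui_reachable(base_url="http://localhost:8188", timeout=3.0):
    """Return True if a ComfyUI instance answers at base_url.

    /system_stats is a lightweight ComfyUI status route; adjust
    base_url if your instance runs on a non-default port.
    """
    try:
        with request.urlopen(f"{base_url}/system_stats",
                             timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        # Connection refused, timeout, DNS failure, etc.
        return False
```

If this returns False, start ComfyUI (or fix the port) before launching server.py.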

Notes and practical details

- The MCP server is a local, Python-based service: it speaks MCP over WebSocket to clients and forwards image requests to the local ComfyUI instance.

- Ensure the model you request exists in ComfyUI’s models/checkpoints directory; the server can only route requests to models that ComfyUI itself can load.

- You can adjust workflows and parameters on the client side. Typical parameters include prompt, width, height, and the workflow to use. The server returns an image URL that you can access through your browser or an integrated viewer.

Configuration and workflow considerations

- Workflows intended for API-style requests should be placed in the workflows/ directory (e.g., basic_api_test.json). Export workflows from ComfyUI in API format to ensure compatibility.

- The server exposes a WebSocket endpoint at ws://localhost:9000. You can connect clients to this endpoint to send MCP requests and receive responses.

Troubleshooting and tips

- If the server cannot reach ComfyUI, verify that ComfyUI is running on the expected port (8188 by default) and that the model you request is present.

- Check that the Python environment has the required packages installed and that you are running the MCP server in an environment with network access to ComfyUI.
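A quick stdlib-only way to confirm the required packages are importable in the active environment:

```python
import importlib.util

def missing_deps(names=("requests", "websockets", "mcp")):
    """Return the subset of names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# If missing_deps() returns a non-empty list, pip install those packages
# in the same environment you use to run server.py.
```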