Unity Ollama MCP server

Integrates Unity Editor with local LLMs through Ollama, enabling natural language control of game objects, scripts, materials, and scenes for streamlined development workflows.
Provider: ZundamonnoVRChatkaisetu
Release date: Mar 19, 2025
Language: Python
Stats: 5 stars

The Unity MCP with Ollama Integration package enables seamless communication between Unity and local Large Language Models (LLMs) via Ollama. It lets you automate Unity workflows, manipulate assets, and control the Unity Editor programmatically using local LLMs, without requiring an internet connection or API keys.

Prerequisites

Before installing, ensure you have the following (a quick check is sketched after this list):

  • Unity 2020.3 LTS or newer
  • Python 3.10 or newer
  • Ollama installed on your system
  • The following LLM models pulled in Ollama:
    ollama pull deepseek-r1:14b
    ollama pull gemma3:12b
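
If you want a quick sanity check before installing, the sketch below (an illustration, not part of the package) confirms the Python version and that Ollama answers on its default port:

    import json
    import sys
    import urllib.request

    # Illustrative prerequisite check: Python 3.10+ and an Ollama server
    # answering on its default port 11434. Not part of the package itself.
    assert sys.version_info >= (3, 10), "Python 3.10 or newer is required"

    with urllib.request.urlopen("http://localhost:11434/api/version") as resp:
        print("Ollama version:", json.loads(resp.read()).get("version"))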
    

Installation

Step 1: Download and Install Editor Scripts

  1. Download or clone the repository:

    git clone https://github.com/ZundamonnoVRChatkaisetu/unity-mcp-ollama.git
    
  2. Create a folder in your Unity project's Assets directory:

    Assets/UnityMCPOllama
    
  3. Copy the Editor folder from the cloned repository to your Unity project:

    [Repository]/Editor → Assets/UnityMCPOllama/Editor
    
  4. Verify the folder structure is correct:

    Assets/
      UnityMCPOllama/
        Editor/
          MCPEditorWindow.cs
          UnityMCPBridge.cs
    
  5. Let Unity import and compile the scripts.

Step 2: Set Up Python Environment

  1. Create a folder for the Python environment (outside your Unity project):

    mkdir PythonMCP
    cd PythonMCP
    
  2. Copy the Python folder from the cloned repository:

    cp -r [Repository]/Python .
    
  3. Create and activate a virtual environment:

    # Create a virtual environment
    python -m venv venv
    
    # Activate the virtual environment
    # On Windows:
    venv\Scripts\activate
    # On macOS/Linux:
    source venv/bin/activate
    
  4. Install dependencies:

    cd Python
    pip install -e .
    

Step 3: Configure Ollama

  1. Ensure Ollama is installed and running on your system (a connectivity check is sketched after these steps)
  2. Pull the supported models:
    ollama pull deepseek-r1:14b
    ollama pull gemma3:12b
    
  3. Start the Ollama server:
    ollama serve
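
To confirm that both models are actually available to Ollama, you can list the locally pulled models through its API. The snippet below is an illustrative check, not something the package requires:

    import json
    import urllib.request

    # Illustrative check that both models used by this integration have been
    # pulled, using Ollama's GET /api/tags endpoint (lists local models).
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        local_models = {m["name"] for m in json.loads(resp.read())["models"]}

    for model in ("deepseek-r1:14b", "gemma3:12b"):
        if model in local_models:
            print(f"{model}: available")
        else:
            print(f"{model}: missing - run 'ollama pull {model}'")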
    

Usage Guide

Starting the System

  1. Start Unity Bridge:

    • Open your Unity project
    • Navigate to Window > Unity MCP to open the MCP window
    • Click the Start Bridge button
  2. Start Python Server:

    cd PythonMCP
    # Activate virtual environment
    # On Windows:
    venv\Scripts\activate
    # On macOS/Linux:
    source venv/bin/activate
    
    # Start the server
    cd Python
    python server.py
    
  3. Configure Ollama Settings:

    • In the Unity MCP window, locate the Ollama Configuration section
    • Verify or update these settings (the sketch after this list shows how they map onto Ollama's API):
      • Host: localhost (default)
      • Port: 11434 (default)
      • Model: Select either deepseek-r1:14b or gemma3:12b
      • Temperature: Adjust as needed (0.0-1.0)
    • Click Apply Ollama Configuration
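
The sketch below illustrates how these settings presumably correspond to Ollama's standard HTTP API: host and port form the base URL, the model is selected by name, and temperature is passed in the request's options. It mirrors Ollama's public API rather than the package's internal code:

    import json
    import urllib.request

    # Hedged sketch of how the window's settings map onto Ollama's API.
    host, port = "localhost", 11434
    payload = json.dumps({
        "model": "gemma3:12b",
        "prompt": "Say hello from the Unity Editor.",
        "stream": False,
        "options": {"temperature": 0.7},
    }).encode("utf-8")

    request = urllib.request.Request(
        f"http://{host}:{port}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as resp:
        print(json.loads(resp.read())["response"])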

Using the Chat Interface

  1. Click the Show Chat Interface button in the Unity MCP window
  2. Type your instructions in the message field
  3. Click Send to process your request

Example Prompts

  • "Create a red cube at position (0, 1, 0)"
  • "Add a sphere to the scene and apply a blue material"
  • "List all objects in the current scene"
  • "Write a simple movement script and attach it to the cube"

Status Indicators

The Unity MCP window provides status information:

  • Python Server Status:

    • Green: Connected
    • Yellow: Connected but with issues
    • Red: Not connected
  • Unity Bridge Status:

    • Running: Unity is listening for connections
    • Stopped: Unity socket server is not active
  • Ollama Status:

    • Connected: Successfully connected to Ollama server
    • Not Connected: Unable to connect to Ollama

Troubleshooting

Common Issues

  1. "Not Connected" Status for Python Server

    • Ensure the Python server is running
    • Check for errors in the Python console
    • Verify the Unity Bridge is running
  2. Cannot find Unity MCP menu

    • Make sure the Editor scripts are properly imported
    • Check the Unity console for any errors
    • Restart Unity if necessary
  3. Ollama Connection Issues

    • Verify Ollama is running with ollama serve
    • Check that models are properly pulled
    • Ensure no firewall is blocking port 11434 (see the port-check sketch after this list)
  4. MCP Command Execution Fails

    • Check Python console for detailed error messages
    • Verify that the Unity Bridge is running
    • Make sure the prompt is clear and specific
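
For connection problems, a low-level socket probe can tell you whether anything is listening on Ollama's port at all. This is an illustrative sketch; the ports used by the Unity Bridge and the Python server are not documented above and are therefore not covered:

    import socket

    # Illustrative connectivity probe: a refused or timed-out connection to
    # port 11434 usually means Ollama is not running or a firewall blocks it.
    def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("Ollama (localhost:11434) reachable:", port_open("localhost", 11434))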

Performance Considerations

Local LLM performance depends on your hardware:

  • For deepseek-r1:14b: Recommended minimum 12GB VRAM
  • For gemma3:12b: Recommended minimum 10GB VRAM
  • CPU-only operation is possible but will be significantly slower

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to that project instead by creating a .cursor/mcp.json file in the project (or adding the server to an existing one).

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project, create a new .cursor/mcp.json file or add the server to the existing one. The entry looks exactly the same as the global MCP server example above; a hedged example for this particular server is sketched below.
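
For this particular server, a project-level entry might launch the Python server directly, as in the hypothetical example below. The server name and path are placeholders, and you will likely want "command" to point at the Python interpreter inside the virtual environment created during installation rather than the system one:

{
    "mcpServers": {
        "unity-mcp-ollama": {
            "command": "python",
            "args": [
                "/path/to/PythonMCP/Python/server.py"
            ]
        }
    }
}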

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them when needed.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
