
Ollama MCP Server

An MCP server that bridges Ollama and MCP clients, exposing tools for listing, inspecting, and querying local models.

Installation
Add the following to your MCP client configuration file.

{
    "mcpServers": {
        "ollama": {
            "command": "uvx",
            "args": [
                "mcp-ollama"
            ]
        }
    }
}

You can connect Ollama to Claude Desktop or other MCP clients through a dedicated MCP server. This server exposes tools to list, inspect, and query Ollama models, enabling seamless interaction between Ollama models and your MCP-powered workflows.

How to use

This MCP server provides three main tools you can call from any MCP client that supports the Model Context Protocol:

  • list_models: lists all downloaded Ollama models
  • show_model: retrieves detailed information about a specific model
  • ask_model: sends a question to a chosen model

Use these tools to discover available models, inspect their capabilities, and interact with models directly from your MCP client.
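
As a minimal sketch, the script below uses the official Python MCP SDK (the mcp package) to launch the server over stdio, exactly as the uvx configuration above does, and calls list_models. It assumes uvx is on your PATH; the assumption that list_models takes no arguments is based on its description here.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the client configuration does.
params = StdioServerParameters(command="uvx", args=["mcp-ollama"])

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("tools:", [tool.name for tool in tools.tools])
            # list_models is assumed to take no arguments.
            result = await session.call_tool("list_models", arguments={})
            print(result.content)

asyncio.run(main())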

How to install

Prerequisites you need on your system before running the MCP server:

  • Ollama installed and running (visit https://ollama.com/download)
  • At least one model pulled with Ollama (for example, ollama pull llama2)
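
Before wiring up a client, you can confirm both prerequisites at once. The sketch below queries Ollama's local REST API (GET /api/tags on its default port 11434, which lists pulled models); if the request fails, Ollama is not running.

import json
import urllib.request

# Ollama listens on http://localhost:11434 by default;
# /api/tags returns the models that have been pulled locally.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.load(resp).get("models", [])

if models:
    print("Ollama is up; pulled models:", [m["name"] for m in models])
else:
    print("Ollama is up but no models are pulled; run: ollama pull llama2")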

Step-by-step setup to get the MCP server up and running:

# 1) Install Python 3.10+ and Ollama
# 2) Pull a model for Ollama (example):
#    ollama pull llama2

# 3) Set up Claude Desktop integration:
#    Add the following to your Claude Desktop config file:
#    macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
#    Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}

# 4) Development workflow (example):
#    git clone https://github.com/yourusername/mcp-ollama.git
#    cd mcp-ollama
#    uv sync

# 5) Test the MCP server in development mode (example):
#    mcp dev src/mcp_ollama/server.py

Note: The server is designed to run in development mode for testing and integration with MCP clients.

Additional configuration and notes

The MCP server exposes a simple, local integration path: your MCP client launches it on demand via uvx, so there is no separate service to manage. The configuration snippet is the same one shown under Installation above; adapt the config file path to your operating system.


Security and usage notes

Keep your MCP server and Ollama environment local: the server communicates over stdio, and Ollama listens on localhost by default, so avoid exposing port 11434 to untrusted networks. If you run multiple MCP servers, isolate their network access and expose only the endpoints your MCP clients need.

Available tools

list_models

Lists all downloaded Ollama models available to the MCP server.

show_model

Retrieves detailed information about a specific Ollama model.

ask_model

Sends a question to a specified Ollama model and returns the answer.
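
As an illustrative sketch, the script below calls ask_model through the same stdio connection pattern shown earlier. The argument names model and question are assumptions, not confirmed schema; inspect the output of session.list_tools() to see the real parameters.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(command="uvx", args=["mcp-ollama"])

async def ask(model: str, question: str) -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "model" and "question" are assumed argument names.
            result = await session.call_tool(
                "ask_model", arguments={"model": model, "question": question}
            )
            print(result.content)

asyncio.run(ask("llama2", "In one sentence, what is the Model Context Protocol?"))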