Ollama MCP server

Integrates with Ollama for local large language model inference, enabling text generation and model management without relying on cloud APIs.
Provider: Matt Green
Release date: Feb 05, 2025
Language: Python
Package: mcp-ollama
Stats: 5.5K downloads, 18 stars

This MCP server provides integration between Ollama and Claude Desktop (or other MCP clients), allowing you to access and use your locally installed Ollama models directly through the MCP protocol.

Requirements

Before you begin, make sure you have:

  • Python 3.10 or higher
  • Ollama installed and running on your system (download Ollama)
  • At least one model pulled with Ollama (e.g., run ollama pull llama2); a quick way to verify these last two items is sketched below
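
If you want to confirm the last two requirements in one step, the short Python sketch below queries Ollama's local REST API (the /api/tags endpoint on the default port 11434) and lists the models you have pulled. The file name is illustrative; this script is not part of mcp-ollama.

# check_ollama.py (illustrative helper, not part of mcp-ollama)
# Confirms Ollama is reachable locally and at least one model is pulled.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # Ollama's default local endpoint

try:
    with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
        models = json.load(resp).get("models", [])
except OSError as exc:
    raise SystemExit(f"Ollama does not appear to be running: {exc}")

if not models:
    raise SystemExit("Ollama is running but no models are pulled. Try: ollama pull llama2")

print("Pulled models:")
for model in models:
    print(" -", model["name"])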

Installation

Installing the MCP Ollama Server

The server is published on PyPI as mcp-ollama and is started with the uvx runner, which downloads the package into a cached environment on first use, so no separate install step is required. (Note that uvx has no install subcommand.) If you prefer a persistent install, use uv's tool installer:

uv tool install mcp-ollama

Configuring Claude Desktop

To integrate with Claude Desktop, you need to add the MCP server configuration to your Claude Desktop configuration file:

On macOS: Edit ~/Library/Application Support/Claude/claude_desktop_config.json

On Windows: Edit %APPDATA%\Claude\claude_desktop_config.json

Add the following configuration, then restart Claude Desktop so it picks up the new server:

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}

Usage

Once installed and configured, the MCP Ollama server provides three main tools:

List Models

The list_models tool allows you to see all Ollama models you have downloaded locally:

[Tool selection interface in Claude Desktop]
Select: list_models

This will return a list of all available models in your Ollama installation (the same information as the /api/tags check shown under Requirements).

Show Model Details

To get detailed information about a specific model:

[Tool selection interface in Claude Desktop]
Select: show_model
Model name: [enter your model name, e.g., llama2]

This provides detailed specifications and information about the selected model.
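
For reference, show_model surfaces the same details that Ollama exposes through its /api/show endpoint. The sketch below is a rough illustration of querying that endpoint directly, not the server's actual implementation:

# show_model_sketch.py (illustrative; the server's own code may differ)
import json
import urllib.request

payload = json.dumps({"model": "llama2"}).encode()  # any model you have pulled

req = urllib.request.Request(
    "http://localhost:11434/api/show",  # Ollama's model-details endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    info = json.load(resp)

print(info.get("details", {}))      # family, parameter size, quantization level
print(info.get("parameters", ""))   # runtime parameters, if any are set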

Ask a Model

To send a query directly to a specific Ollama model:

[Tool selection interface in Claude Desktop]
Select: ask_model
Model name: [enter your model name, e.g., llama2]
Question: [enter your query]

The response from the selected Ollama model will be displayed in Claude Desktop.
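
Under the hood this corresponds to Ollama's /api/generate endpoint. If you ever need to reproduce a query outside Claude Desktop, here is a minimal sketch (again illustrative, not the server's actual code):

# ask_model_sketch.py (illustrative; not the server's actual implementation)
import json
import urllib.request

payload = json.dumps({
    "model": "llama2",               # any model you have pulled
    "prompt": "Why is the sky blue?",
    "stream": False,                 # return one JSON object instead of a stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's text-generation endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])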

Troubleshooting

If you encounter issues:

  • Ensure Ollama is running in the background
  • Verify you've pulled at least one model using ollama pull [model-name]
  • Check that your Claude Desktop configuration file contains the correct MCP server configuration (a quick validation sketch follows this list)
  • Make sure you're using Python 3.10 or higher
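
To rule out a malformed configuration file, the hypothetical snippet below parses it and checks for the ollama entry (macOS path shown; substitute the Windows path from the Installation section):

# validate_config.py (hypothetical helper; adjust the path for your OS)
import json
import pathlib

config_path = (
    pathlib.Path.home()
    / "Library/Application Support/Claude/claude_desktop_config.json"
)
config = json.loads(config_path.read_text())  # raises if the JSON is malformed

servers = config.get("mcpServers", {})
assert "ollama" in servers, "no 'ollama' entry under mcpServers"
print("Config OK:", servers["ollama"])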

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "ollama": {
            "command": "uvx",
            "args": [
                "mcp-ollama"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
