Ollama MCP server

Integrates Ollama's local LLM models with MCP-compatible applications, enabling on-premise AI processing and custom model deployment while maintaining data control.
Provider: tigreen
Release date: Feb 13, 2025
Language: TypeScript
Package: @rawveg/ollama-mcp
Stats: 796 downloads, 50 stars

The Ollama MCP Server enables integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop. It allows you to use your locally installed Ollama models with any application that supports the Model Context Protocol.

Prerequisites

  • Node.js (v16 or higher)
  • npm
  • Ollama installed and running locally

Installation Options

Global Installation

Install the server globally via npm:

npm install -g @rawveg/ollama-mcp

Integration with MCP Applications

To use Ollama MCP Server with applications like Cline or Claude Desktop, add this configuration to your application's MCP settings file:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}

The settings file location depends on your application:

  • Claude Desktop: claude_desktop_config.json in the Claude app data directory (typically ~/Library/Application Support/Claude/ on macOS, or %APPDATA%\Claude\ on Windows)
  • Cline: cline_mcp_settings.json in the VS Code global storage

Usage

Starting the Server

Run the server with the default settings:

ollama-mcp

By default, the server starts on port 3456. You can specify a different port if needed:

PORT=3457 ollama-mcp
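
To confirm the server is reachable, you can request the models endpoint from another process (see API Endpoints below). A minimal TypeScript sketch, assuming Node 18+ (built-in fetch, top-level await in an ES module) and the default port:

// Quick check that ollama-mcp is up. Assumes the default port 3456
// and that GET /models returns JSON.
const res = await fetch("http://localhost:3456/models");
if (!res.ok) throw new Error(`server responded with HTTP ${res.status}`);
console.log(await res.json());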

Configuration Options

The server can be configured using these environment variables:

  • PORT - the port the server listens on (default: 3456)
  • OLLAMA_API - the URL of the Ollama API to connect to (Ollama's standard local endpoint is http://localhost:11434)

Example with custom Ollama API endpoint:

OLLAMA_API=http://192.168.1.100:11434 ollama-mcp

Available Features

Once running, the Ollama MCP Server provides these capabilities:

  • List all available Ollama models
  • Pull new models from Ollama (see the sketch after this list)
  • Chat with models using Ollama's chat API
  • Get detailed information about specific models
  • Automatic port management
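
As an example of the pull capability, the sketch below posts to the server's pull endpoint from TypeScript. The { name: ... } request body is an assumption modeled on Ollama's own pull API; check the server's source if the field differs:

// Sketch: ask the MCP server to pull a model from the Ollama registry.
// NOTE: the request body shape is an assumption based on Ollama's /api/pull.
const res = await fetch("http://localhost:3456/models/pull", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ name: "llama3.2" }),
});
console.log(res.ok ? "pull requested" : `pull failed: HTTP ${res.status}`);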

API Endpoints

The server exposes these endpoints:

  • GET /models - Lists all available models
  • POST /models/pull - Pulls a new model
  • POST /chat - Chats with a model (see the sketch after this list)
  • GET /models/:name - Gets model details
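
As a usage sketch, the TypeScript below sends a chat request. The payload shape (a model name plus a messages array) is an assumption mirroring Ollama's own chat API, which the server wraps:

// Sketch: chat with a local model through the MCP server's HTTP API.
// The payload shape is assumed to mirror Ollama's /api/chat request.
const res = await fetch("http://localhost:3456/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.2",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
  }),
});
console.log(await res.json());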

With the server running, MCP-compatible applications will automatically detect and use your local Ollama models, providing a seamless experience between local and cloud-based models.

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to the project by creating (or editing) a .cursor/mcp.json file in the project directory.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "@rawveg/ollama-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "@rawveg/ollama-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does, for example: "Use the ollama-mcp server to list my local models."
