LMStudio MCP Server

Bridges Claude MCP with locally running LM Studio models for health checks, model listing, and local text generation.

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "lmstudio_mcp": {
            "command": "uvx",
            "args": [
                "https://github.com/infinitimeless/LMStudio-MCP"
            ]
        }
    }
}
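
The uvx launcher in this configuration is part of uv. If uv is not already installed, Astral's official installer script is one way to get it:

curl -LsSf https://astral.sh/uv/install.sh | sh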

LMStudio-MCP creates a bridge between Claude with MCP capabilities and your locally running LM Studio instance. It lets Claude check the health of LM Studio, list available models, identify the currently loaded model, and generate completions from your private local models. The result is fast, private inference: you use Claude's interface while the models run entirely on your machine.

How to use

Connect to the MCP server from your MCP client and use the available functions to interact with your local LM Studio setup. You can check that the local API is reachable, see which models are available, determine which model is currently loaded, and generate text with it. Together, these functions let you combine Claude's features with models you control locally.

Practical usage patterns include: health checks to confirm the LM Studio API is online, listing models to choose a suitable model, getting the active model before starting a session, and sending chat prompts to generate completions from your locally hosted model. This setup keeps your data on your machine while still benefiting from Claude’s MCP interface.
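
Because the bridge talks to LM Studio's OpenAI-compatible local server, you can preview what these patterns do with raw HTTP calls. The sketch below assumes LM Studio's default server address of http://localhost:1234 (adjust if you changed it in LM Studio), and the model id is a placeholder to replace with one reported by the models endpoint:

# List available models (what list_models surfaces)
curl http://localhost:1234/v1/models

# Ask the loaded model for a completion (what chat_completion wraps)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "your-model-id", "messages": [{"role": "user", "content": "Say hello"}]}'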

How to install

Before installing the MCP bridge, make sure you have:

- Python 3.7+ installed on your system.

- LM Studio installed and running locally with a model loaded (a quick check follows this list).

- Claude with MCP access.
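
A quick way to confirm the first two prerequisites from a terminal (the port again assumes LM Studio's default of 1234):

# Confirm Python is on the PATH and recent enough
python3 --version

# Confirm LM Studio's local server is up and reports at least one model
curl -s http://localhost:1234/v1/models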

Install using the recommended one-line installer:

curl -fsSL https://raw.githubusercontent.com/infinitimeless/LMStudio-MCP/main/install.sh | bash

Alternatively, choose one of the following manual installation methods:

1) Local Python installation

git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
pip install requests "mcp[cli]" openai
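
With a local installation, your MCP client configuration points at the cloned script instead of uvx. This is a sketch only: the path is a placeholder, and the entry-point script name is an assumption to verify against the repository:

{
    "mcpServers": {
        "lmstudio_mcp": {
            "command": "python",
            "args": [
                "/path/to/LMStudio-MCP/lmstudio_bridge.py"
            ]
        }
    }
}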

2) Docker installation

# Using pre-built image
docker run -it --network host ghcr.io/infinitimeless/lmstudio-mcp:latest

# Or build locally
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker build -t lmstudio-mcp .
docker run -it --network host lmstudio-mcp

3) Docker Compose

git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
docker-compose up -d
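
Because Compose starts the service detached, you can confirm it came up with the standard commands:

# Show the running service and follow its logs
docker-compose ps
docker-compose logs -f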

Available tools

health_check

Verify that the LM Studio API is reachable from the MCP bridge and returns a healthy status.

list_models

Retrieve the list of models currently available in LM Studio.

get_current_model

Identify which model is currently loaded in LM Studio.

chat_completion

Generate text completions by sending a prompt to the local LM Studio model through the MCP bridge.