Claude LMStudio MCP Server
Bridges Claude with local LM Studio models to list models, generate text, and perform chat completions.
Configuration
{
  "mcpServers": {
    "lmstudio_bridge": {
      "command": "/bin/bash",
      "args": [
        "/path/to/claude-lmstudio-bridge/run_server.sh"
      ],
      "env": {
        "LMSTUDIO_HOST": "127.0.0.1",
        "LMSTUDIO_PORT": "1234",
        "DEBUG": "false"
      }
    }
  }
}

You can bridge Claude with your local LM Studio models using this MCP server. It lets Claude discover locally running models, generate text, handle chat completions, and perform a health check against LM Studio, all through a standard MCP interface that you can control from Claude Desktop.
You will configure an MCP client in Claude Desktop to point to the local bridge server. Once set up, you can ask Claude to check connectivity, list available models, generate text with a local model, or send a chat completion request to your LM Studio instance. The bridge exposes these capabilities so you can operate your local LLMs directly from Claude.
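Under the hood, the bridge talks to LM Studio's local server, which exposes an OpenAI-compatible REST API. As a rough sketch of what "list available models" amounts to (the `/v1/models` path follows the OpenAI convention that LM Studio implements; the helper names here are illustrative, not the bridge's actual functions):

```python
import json
import os
from urllib.request import urlopen

def lmstudio_base_url() -> str:
    # Build the base URL from the same environment variables the bridge
    # reads, falling back to the documented defaults.
    host = os.environ.get("LMSTUDIO_HOST", "127.0.0.1")
    port = os.environ.get("LMSTUDIO_PORT", "1234")
    return f"http://{host}:{port}/v1"

def list_models(base_url: str) -> list[str]:
    # GET /v1/models returns {"data": [{"id": ...}, ...]} in the
    # OpenAI-compatible format served by LM Studio's local API server.
    with urlopen(f"{base_url}/models", timeout=5) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

# list_models(lmstudio_base_url())  # requires a running LM Studio server
```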
Follow these steps to set up the bridge and connect it to Claude Desktop. Setup uses the provided scripts and MCP configuration to run the local server process.
# macOS/Linux quick start
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
chmod +x setup.sh
./setup.sh
# Follow the setup prompts to complete Claude Desktop configuration

REM Windows quick start
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
setup.bat
REM Follow the setup prompts to complete Claude Desktop configuration

If you prefer manual setup, create and activate a virtual environment, install dependencies, and configure Claude Desktop to point at the bridge runner scripts as shown below.
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
# Configure Claude Desktop MCP server using one of the explicit commands below

# Manual MCP server configuration (macOS/Linux)
Name: lmstudio_bridge_linux
Command: /bin/bash
Arguments: /path/to/claude-lmstudio-bridge/run_server.sh

# Manual MCP server configuration (Windows)
Name: lmstudio_bridge_win
Command: cmd.exe
Arguments: /c C:\path\to\claude-lmstudio-bridge\run_server.bat

Configuration details, troubleshooting tips, and optional advanced settings are provided below to ensure a smooth setup and reliable operation.
Use the debugging tool to verify connectivity and perform detailed tests against LM Studio.
python debug_lmstudio.py
python debug_lmstudio.py --test-chat --verbose

Common issues and quick fixes: verify that LM Studio is running, that its API server is enabled, and that the port matches the value in your .env file. If no model is loaded, load one in LM Studio.
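The `--test-chat` check presumably sends a small chat request to LM Studio. A minimal sketch of such a request, assuming the OpenAI-style `/v1/chat/completions` endpoint and response shape that LM Studio's local server implements (the model name is a placeholder):

```python
import json
from urllib.request import Request, urlopen

def build_chat_body(model: str, user_message: str) -> dict:
    # OpenAI-style chat payload; the model name is whatever is
    # currently loaded in LM Studio.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }

def chat_completion(base_url: str, model: str, user_message: str) -> str:
    req = Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_body(model, user_message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # Requires a running LM Studio server with a model loaded.
    with urlopen(req, timeout=60) as resp:
        data = json.load(resp)
    # The reply text lives in the first choice's message content.
    return data["choices"][0]["message"]["content"]
```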
You can customize the bridge behavior by setting environment variables in a local .env file. This is useful for directing the bridge to your LM Studio instance and controlling verbose output during troubleshooting.
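The variables listed below could be read with sensible defaults along these lines (a sketch only; the bridge's actual parsing, including which spellings of DEBUG it accepts, may differ):

```python
import os

def load_settings(env=None) -> dict:
    # Read bridge settings, falling back to the documented defaults.
    env = os.environ if env is None else env
    return {
        "host": env.get("LMSTUDIO_HOST", "127.0.0.1"),
        "port": int(env.get("LMSTUDIO_PORT", "1234")),
        # Treat common truthy spellings as enabling debug output.
        "debug": env.get("DEBUG", "false").strip().lower() in ("1", "true", "yes"),
    }
```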
LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false

The bridge exposes the following tools:

Health check to verify connectivity with LM Studio and confirm that the API server is responding.
List all available models currently loaded or discoverable in LM Studio.
Generate text using a specified local model, returning the produced text.
Submit a chat-style prompt to a local model and retrieve a conversational completion.
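A health check like the one described above can be as simple as probing the model-list endpoint and treating any well-formed response as "up". A sketch under that assumption (the function name is illustrative, not the bridge's actual tool):

```python
from urllib.error import URLError
from urllib.request import urlopen

def lmstudio_healthy(base_url: str = "http://127.0.0.1:1234/v1") -> bool:
    # Asking the server for its model list is a cheap connectivity probe;
    # a 200 response means the API server is enabled and reachable.
    try:
        with urlopen(f"{base_url}/models", timeout=3) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False
```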