Claude-LMStudio Bridge MCP server

Bridges Claude with local LLMs running in LM Studio, enabling direct access to local model capabilities for text generation and chat completions while reducing cloud dependency.
Provider: infinitimeless
Release date: Mar 21, 2025
Language: Python
Stats: 2 stars

This MCP server acts as a bridge between Claude and locally hosted language models running in LM Studio. It enables Claude to access your local models for text generation, chat completions, and other language model functions without sending data to external APIs.

Prerequisites

  • Claude Desktop with MCP support
  • LM Studio installed and running locally with API server enabled
  • Python 3.8+ installed

Installation

Quick Setup (Recommended)

For macOS/Linux:

  1. Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
  2. Run the setup script
chmod +x setup.sh
./setup.sh
  3. Follow the setup script's instructions to configure Claude Desktop

For Windows:

  1. Clone the repository
git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
cd claude-lmstudio-bridge
  2. Run the setup script
setup.bat
  3. Follow the setup script's instructions to configure Claude Desktop

Manual Installation

If you prefer setting things up manually:

  1. Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install the required packages
pip install -r requirements.txt
  3. Configure Claude Desktop:
    • Open Claude Desktop preferences
    • Navigate to the 'MCP Servers' section
    • Add a new MCP server with the following configuration:
      • Name: lmstudio-bridge
      • Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
      • Arguments:
        • macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
        • Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
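If you prefer editing configuration files directly, the same settings can be expressed as JSON in Claude Desktop's claude_desktop_config.json (the file's location varies by platform). This is a sketch of an equivalent entry; the path below is a placeholder you must replace with your actual clone location:

```json
{
    "mcpServers": {
        "lmstudio-bridge": {
            "command": "/bin/bash",
            "args": [
                "/path/to/claude-lmstudio-bridge/run_server.sh"
            ]
        }
    }
}
```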

Using the Bridge with Claude

After setting up the bridge, you can use these commands in Claude:

Check Connection Status

To verify if LM Studio is properly connected:

Can you check if my LM Studio server is running?

List Available Models

To see all models loaded in LM Studio:

List the available models in my local LM Studio

Generate Text with Local Models

To use your local model for text generation:

Generate a short poem about spring using my local LLM

Chat Completions

To send a query to your local model:

Ask my local LLM: "What are the main features of transformers in machine learning?"
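Under the hood, a chat request like the one above is forwarded to LM Studio's OpenAI-compatible /v1/chat/completions endpoint. The sketch below builds such a request with only the standard library (the bridge itself uses httpx); the model name "local-model" is a placeholder, since LM Studio serves whichever model is currently loaded:

```python
import json

# Assumed default LM Studio API address (see the .env settings below).
LMSTUDIO_BASE = "http://127.0.0.1:1234/v1"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Return the (url, body) pair for an OpenAI-style chat completion call."""
    url = f"{LMSTUDIO_BASE}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }).encode("utf-8")
    return url, body

url, body = build_chat_request(
    "What are the main features of transformers in machine learning?"
)
```

An actual request would then POST `body` to `url` with a `Content-Type: application/json` header.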

Troubleshooting

Diagnosing Connection Issues

Use the included debugging tool:

python debug_lmstudio.py

For more detailed tests:

python debug_lmstudio.py --test-chat --verbose

Common Issues and Solutions

"Cannot connect to LM Studio API"

  • Ensure LM Studio is running
  • Verify the API server is enabled in LM Studio (Settings > API Server)
  • Check that the port (default: 1234) matches what's in your .env file
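A quick way to check these three points yourself is to hit LM Studio's /v1/models endpoint, which lists the loaded models. This is a standalone sketch (not part of the bridge's own code), assuming the default host and port:

```python
from urllib.request import urlopen

def models_endpoint(host="127.0.0.1", port=1234):
    # LM Studio's API server is OpenAI-compatible; /v1/models lists loaded models.
    return f"http://{host}:{port}/v1/models"

def lmstudio_reachable(host="127.0.0.1", port=1234, timeout=2.0):
    """Return True if the LM Studio API server answers at host:port."""
    try:
        with urlopen(models_endpoint(host, port), timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, DNS failure, timeout, etc.
        return False
```

If `lmstudio_reachable()` returns False, confirm LM Studio's API server is enabled and that the port matches your .env file.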

"No models are loaded"

  • Open LM Studio and load a model
  • Verify the model is running successfully

"MCP package not found"

  • Reinstall required packages: pip install "mcp[cli]" httpx python-dotenv
  • Make sure you're using Python 3.8 or later

"Claude can't find the bridge"

  • Check Claude Desktop configuration
  • Ensure the path to run_server.sh or run_server.bat is correct and absolute
  • Verify the server script is executable: chmod +x run_server.sh (on macOS/Linux)

Advanced Configuration

Create a .env file to customize settings:

LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false

Set DEBUG=true to enable verbose logging when troubleshooting.
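The effective defaults can be sketched as follows. The bridge itself loads the .env file via python-dotenv; this standalone sketch reads the same variables with plain os.getenv so you can see the fallback values:

```python
import os

def bridge_settings():
    """Resolve the bridge's settings, falling back to the documented defaults."""
    return {
        "host": os.getenv("LMSTUDIO_HOST", "127.0.0.1"),
        "port": int(os.getenv("LMSTUDIO_PORT", "1234")),
        "debug": os.getenv("DEBUG", "false").lower() == "true",
    }
```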

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

Clicking that button opens the ~/.cursor/mcp.json file, where you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a single project, create a new .cursor/mcp.json file or add the server to the existing one. The configuration looks exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
