OpenAI Assistant MCP Server

MCP server that gives Claude the ability to use OpenAI's assistants through the Assistants API

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "andybrandt-mcp-simple-openai-assistant": {
      "command": "python",
      "args": [
        "-m",
        "mcp_simple_openai_assistant"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}

You can deploy MCP Simple OpenAI Assistant to interact with OpenAI assistants through the Model Context Protocol. This server enables you to create, manage, and converse with assistants in real time, with local thread persistence to simplify reuse across sessions.

How to use

After you have the server running, you interact with it through your MCP client by using the available tools. The primary workflow is to start a new named conversation, locate the thread you want to continue, and ask questions to your chosen assistant within that thread. Real-time streaming updates keep you informed as the assistant generates responses, improving responsiveness during long or complex interactions.
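The workflow above maps onto a sequence of MCP tool calls. The sketch below builds the JSON-RPC messages an MCP client would send for that sequence; the tool names match the list at the end of this page, but the argument names and IDs are illustrative assumptions, not the server's exact schema.

```python
import json

def tool_call(call_id, name, arguments):
    # Shape of an MCP "tools/call" request (JSON-RPC 2.0).
    return {"jsonrpc": "2.0", "id": call_id, "method": "tools/call",
            "params": {"name": name, "arguments": arguments}}

# Illustrative workflow: start a named thread, find it later, ask a question.
# Argument names ("name", "thread_id", "message", ...) are hypothetical.
workflow = [
    tool_call(1, "create_new_assistant_thread",
              {"name": "bug-triage", "description": "Ongoing debugging chat"}),
    tool_call(2, "list_threads", {}),
    tool_call(3, "ask_assistant_in_thread",
              {"thread_id": "thread_abc123", "assistant_id": "asst_abc123",
               "message": "Summarize what we found so far."}),
]

print(json.dumps(workflow[0], indent=2))
```

Each call streams its result back to the client, so long-running assistant responses arrive incrementally rather than after a single timeout-prone wait.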

How to install

Prerequisites: Python 3.8 or newer, and a running MCP client capable of communicating with MCP servers over stdio.

Option A: Install via Smithery for automatic client integration with Claude Desktop.

npx -y @smithery/cli install mcp-simple-openai-assistant --client claude

Option B: Manual installation via pip.

pip install mcp-simple-openai-assistant

Configuration

You must provide an OpenAI API key so the server can call the OpenAI API. The following configurations show environment setups for macOS and Windows; use the one that matches your operating system.

{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

The Windows configuration example uses an explicit path to the Python executable. Adjust the path to match your Python installation.

{
  "mcpServers": {
    "openai-assistant": {
      "command": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\python.exe",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

Notes on usage and persistence

The server uses a streaming approach for the main interaction to provide real-time progress updates and avoid timeouts. It also persists conversation threads locally using a SQLite database so you can reuse threads across sessions. You can create a new thread, list existing threads, and delete threads as needed.
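To make the persistence model concrete, here is a minimal sketch of a SQLite-backed thread store covering the create/list/delete operations described above. The table layout and function names are assumptions for illustration; the server's actual schema may differ.

```python
import sqlite3

def open_store(path=":memory:"):
    # One row per conversation thread; thread_id is the OpenAI thread ID.
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS threads (
                      thread_id   TEXT PRIMARY KEY,
                      name        TEXT,
                      description TEXT)""")
    return db

def save_thread(db, thread_id, name, description=""):
    db.execute("INSERT OR REPLACE INTO threads VALUES (?, ?, ?)",
               (thread_id, name, description))
    db.commit()

def list_threads(db):
    return db.execute("SELECT thread_id, name FROM threads").fetchall()

def delete_thread(db, thread_id):
    # The real server also deletes the thread on OpenAI's side.
    db.execute("DELETE FROM threads WHERE thread_id = ?", (thread_id,))
    db.commit()

db = open_store()
save_thread(db, "thread_abc123", "bug-triage")
print(list_threads(db))
```

Because the store keys on the OpenAI thread ID, a thread created in one session can be listed and resumed in the next without re-creating it server-side.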

Troubleshooting and tips

If you encounter issues starting the server, verify that Python is installed and that OPENAI_API_KEY is set in the environment the MCP client uses to launch the server. On Windows, ensure the configured Python executable path is correct. If you experience connection problems, confirm that your MCP client is configured with the correct command and arguments to launch the server over stdio.
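The checks above can be scripted as a quick preflight. This is a convenience sketch, not part of the package; run it with the same Python interpreter your MCP client is configured to use.

```python
import os
import sys

def preflight():
    # Mirror the troubleshooting steps: Python version and API key presence.
    problems = []
    if sys.version_info < (3, 8):
        problems.append(f"Python {sys.version.split()[0]} is older than 3.8")
    if not os.environ.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set in this environment")
    return problems

for problem in preflight():
    print("WARN:", problem)
```

An empty result means the basics are in place; remaining failures usually come down to the MCP client launching a different interpreter than the one you tested.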

Available tools

create_assistant

Create a new OpenAI assistant with a name, instructions, and model specification.

list_assistants

List all assistants associated with your API key.

retrieve_assistant

Get detailed information about a specific assistant by its ID.

update_assistant

Modify an existing assistant's name, instructions, or model.

create_new_assistant_thread

Start a new persistent conversation thread with a user-defined name and description.

list_threads

List the conversation threads stored in the local SQLite database.

delete_thread

Delete a conversation thread from both OpenAI servers and the local database.

ask_assistant_in_thread

Send a message to an assistant within a thread and stream the response in real time.
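To illustrate the streaming behavior this tool relies on, here is a small sketch of consuming text deltas incrementally rather than waiting for the full response. The delta source is a stand-in; the real deltas come from OpenAI's streaming run events.

```python
from typing import Iterator

def fake_stream() -> Iterator[str]:
    # Stand-in for the assistant's streamed output chunks.
    yield from ["The answer ", "is ", "42."]

def collect_stream(deltas: Iterator[str]) -> str:
    # Surface each chunk as it arrives, then return the assembled reply.
    parts = []
    for delta in deltas:
        print(delta, end="", flush=True)  # real-time progress for the client
        parts.append(delta)
    print()
    return "".join(parts)

final = collect_stream(fake_stream())
```

Forwarding chunks as they arrive is what keeps long assistant responses from hitting client-side timeouts.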