
OpenAI MCP Server

Queries OpenAI models via MCP from Claude, enabling centralized configuration and easy client access.

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "openai_mcp": {
            "command": "python",
            "args": [
                "-m",
                "src.mcp_server_openai.server"
            ],
            "env": {
                "PYTHONPATH": "C:/path/to/your/mcp-server-openai",
                "OPENAI_API_KEY": "your-key-here"
            }
        }
    }
}

You can query OpenAI models directly from Claude using an MCP (Model Context Protocol) server. This routes requests through a dedicated server process, enabling integration with your MCP client workflows while keeping API keys and configuration centralized.

How to use

Launch the OpenAI MCP server as a local stdio service and connect your MCP client to it. The server runs a Python module that exposes an MCP endpoint for OpenAI model calls. Supply your OpenAI API key via the OPENAI_API_KEY environment variable and set PYTHONPATH to the project directory so the module can be found. Once running, send requests from your client as you would to any MCP endpoint, and OpenAI model responses come back through the MCP channel.
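Under the hood, that MCP channel is JSON-RPC 2.0 over the server process's stdin and stdout. MCP clients handle this framing for you, but a minimal sketch of the messages involved (method names from the MCP specification; the protocol version shown is one published revision and newer clients may send another) looks like:

```python
import json

def jsonrpc_request(method: str, params: dict, req_id: int) -> str:
    """Serialize one JSON-RPC 2.0 request line, as sent over the MCP stdio transport."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})

# Handshake a client performs before calling any tools on the server:
init = jsonrpc_request("initialize", {
    "protocolVersion": "2024-11-05",  # an assumed spec revision; clients may use a newer one
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1"},
}, 1)

# The client can then discover the server's OpenAI tools:
list_tools = jsonrpc_request("tools/list", {}, 2)
print(init)
print(list_tools)
```

Your MCP client sends these lines to the server process's stdin and reads responses from its stdout; you never need to construct them yourself.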

How to install

Before installing this MCP server you need Python with a working pip environment, plus an OpenAI API key to access the models.

Step 1: Install the MCP server locally by cloning the project and installing in editable mode.

git clone https://github.com/pierrebrunelle/mcp-server-openai
cd mcp-server-openai
pip install -e .
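After installing, you can sanity-check that the server module is importable from your environment. A small sketch (the dotted module name matches the args in the configuration below; run it from the project root or with PYTHONPATH set):

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported on the current sys.path/PYTHONPATH."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:  # a parent package is missing entirely
        return False

# Should print True after `pip install -e .` with PYTHONPATH pointing at the project:
print(module_available("src.mcp_server_openai.server"))
```

If this prints False, the most common fix is correcting the PYTHONPATH entry in your client configuration.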

Step 2: Prepare environment variables and the module path for the server: supply your OpenAI API key through OPENAI_API_KEY and set PYTHONPATH to the project location, as in the configuration block under Installation above.
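Rather than editing the client configuration by hand, the openai_mcp entry can be merged in with a short script. A sketch, assuming a JSON config file whose location varies by client (Claude Desktop, for example, reads claude_desktop_config.json):

```python
import json
from pathlib import Path

def add_openai_mcp(config_path: Path, project_dir: str, api_key: str) -> dict:
    """Merge the openai_mcp server entry into an MCP client config file, creating it if absent."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["openai_mcp"] = {
        "command": "python",
        "args": ["-m", "src.mcp_server_openai.server"],
        "env": {"PYTHONPATH": project_dir, "OPENAI_API_KEY": api_key},
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Using setdefault preserves any other servers already registered in the same file.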

Step 3: Start using the server in your MCP client by loading the configuration above and connecting to the openai_mcp server name. If you use a desktop MCP tool, make sure it reads the same configuration block so it can launch Python with the correct module and environment.

Additional notes

Configuration notes: the server is configured to run via Python and the module path is provided through PYTHONPATH. The OpenAI API key must be supplied securely via OPENAI_API_KEY.
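One way to fail fast when the key is missing, rather than hitting an authentication error mid-request (a sketch; the real server's startup checks may differ):

```python
import os

def require_api_key() -> str:
    """Read OPENAI_API_KEY from the environment, raising a clear error if it is unset."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; export it before launching the server.")
    return key
```

Reading the key from the environment keeps it out of version control and out of the config files you might share.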

Development workflow: you can set up the project locally using the standard Python packaging workflow and run tests with pytest if your environment includes the test suite. For example, after cloning the repository, install the package and run tests to verify the OpenAI integration.
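A pytest-style test of the OpenAI call can stub out the client so no network access or API key is needed. A sketch, in which answer_with is a hypothetical stand-in for the server's helper and the model name is an assumption, not the server's actual default:

```python
from unittest import mock

def answer_with(client, prompt: str) -> str:
    """Hypothetical helper mirroring the server's OpenAI call: send one prompt, return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; the real server may use another
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def test_answer_with_extracts_message_text():
    # Stub the OpenAI client so the test runs offline and without a key.
    fake = mock.MagicMock()
    fake.chat.completions.create.return_value.choices = [
        mock.MagicMock(message=mock.MagicMock(content="Hello from the model"))
    ]
    assert answer_with(fake, "Say hello") == "Hello from the model"
```

Tests like this verify the response-parsing logic in isolation; integration tests against the live API can then be kept separate and optional.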

Test output may include sample OpenAI responses, which helps validate the integration during development.