OpenAI MCP Server
Queries OpenAI models via MCP with support for o3-mini and gpt-4o-mini, customizable formatting, and error handling.
Configuration
{
  "mcpServers": {
    "openai_mcp": {
      "command": "python",
      "args": [
        "-m",
        "src.mcp_server_openai.server",
        "--openai-api-key",
        "YOUR_OPENAI_API_KEY"
      ],
      "env": {
        "PYTHONPATH": "/path/to/your/mcp-server-openai"
      }
    }
  }
}

You can query OpenAI models directly through the MCP protocol by running a dedicated OpenAI MCP Server. The server lets Claude Desktop, or any compatible MCP client, forward queries to OpenAI models and receive structured responses. It supports multiple models and is straightforward to configure.
To use this MCP server from your MCP client, add the server you configured above as a connected MCP endpoint, then invoke its single tool to ask questions or request model-generated answers. You can choose between models such as o3-mini for concise replies and gpt-4o-mini for more detailed explanations. The server formats responses in a consistent, easy-to-consume structure and handles errors and logging to simplify troubleshooting.
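To illustrate what a client-side call looks like, here is a minimal sketch using the official MCP Python SDK (the mcp package) to launch the server over stdio and invoke its tool. The tool name ask-openai and the query/model argument names are assumptions, not taken from this page; list the server's tools first and adjust to whatever names it actually reports.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server the same way Claude Desktop would (placeholder key and path).
    server = StdioServerParameters(
        command="python",
        args=["-m", "src.mcp_server_openai.server",
              "--openai-api-key", "YOUR_OPENAI_API_KEY"],
        env={**os.environ, "PYTHONPATH": "/path/to/your/mcp-server-openai"},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tool name the server actually exposes.
            tools = await session.list_tools()
            print("Tools:", [t.name for t in tools.tools])
            # Assumed tool and argument names; adjust to match the listing above.
            result = await session.call_tool(
                "ask-openai",
                {"query": "Summarize MCP in one sentence.", "model": "gpt-4o-mini"},
            )
            print(result.content)

asyncio.run(main())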
Before installation you need Python 3.10 or newer, an OpenAI API key, and a working Python environment in which you can install packages with pip.
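If you want to confirm the prerequisites before installing, a quick check along these lines (purely illustrative) is enough:

import shutil
import sys

# The server package requires Python 3.10 or newer.
assert sys.version_info >= (3, 10), "Python 3.10 or newer is required"
# pip is needed to install the package in the next step.
assert shutil.which("pip") or shutil.which("pip3"), "pip is required"
print("Python", sys.version.split()[0], "and pip are available")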
Step 1. Clone the repository and install the package locally.
# Clone the repository
git clone https://github.com/thadius83/mcp-server-openai.git
cd mcp-server-openai
# Install dependencies
pip install -e .

Step 2. Configure Claude Desktop to recognize the MCP server. Add the following configuration to your MCP settings, keeping any existing servers you already use.
{
  "mcpServers": {
    "github.com/thadius83/mcp-server-openai": {
      "command": "python",
      "args": [
        "-m",
        "src.mcp_server_openai.server",
        "--openai-api-key",
        "YOUR_OPENAI_API_KEY"
      ],
      "env": {
        "PYTHONPATH": "/path/to/your/mcp-server-openai"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Step 3. Start Claude Desktop and verify the new MCP server appears under your configured MCP endpoints. If you previously edited the file, restart Claude to apply the changes.
Configuration notes
- The MCP server runs as a local process that Claude Desktop launches with a Python command. The parts that matter are the Python module path, the --openai-api-key flag, and the PYTHONPATH entry pointing at your local clone of the repository.
Security considerations
- Treat your OpenAI API key as sensitive. Do not expose it in shared configurations; replace the YOUR_OPENAI_API_KEY placeholder with your real key only in your own environment.
Troubleshooting tips
- If the server cannot be found, confirm PYTHONPATH points to the correct directory and that Python is installed.
- If authentication fails, verify the API key is valid and correctly passed to the server.
- If you see model errors, ensure you are using a supported model (o3-mini or gpt-4o-mini) and that your query is well-formed.
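If you are unsure whether a failure is on the MCP side or the OpenAI side, you can take the server out of the loop and call the OpenAI API directly. The following is a minimal sketch using the official openai Python package (pip install openai); replace the placeholder with the same key you pass to the MCP server.

from openai import OpenAI

client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # same key as in the MCP config
response = client.chat.completions.create(
    model="gpt-4o-mini",  # or "o3-mini"
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)

If this call succeeds but the MCP server still fails, the problem is in the server configuration (PYTHONPATH, module path, or how the key is passed) rather than the key or the model.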
Available tool: Ask OpenAI assistant models a direct question.