
OpenAI Tool MCP Server

MCP wrapper for OpenAI's built-in tools

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "alohays-openai-tool2mcp": {
      "command": "uv",
      "args": [
        "run",
        "openai_tool2mcp/server_entry.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

You can run OpenAI's mature built-in tools as MCP-compliant servers, letting Claude Desktop and other MCP clients access web search, code interpreter, and more through a lightweight bridge. This server wraps those tools using the Model Context Protocol (MCP), enabling seamless interoperability in an open ecosystem.

How to use

You interact with the server through an MCP client as a standard MCP endpoint. Start the local MCP server, then point your client at it; queries that require tools are translated into calls to OpenAI's built-in tools, and the results are returned in MCP format. There are two ways to run the server: host it locally with uv, or start it with the built-in CLI command. Choose whichever fits your workflow and client configuration.

How to install

Prerequisites: Python 3.10 or newer and an OpenAI API key with access to the Assistants API. Installing uv is recommended for better MCP compatibility.

# Install the package from PyPI
pip install openai-tool2mcp

# Or install the latest development version
pip install git+https://github.com/alohays/openai-tool2mcp.git

# Recommended: Install uv for MCP compatibility
pip install uv

Additional setup steps

Set your OpenAI API key in your environment before starting the server. This key enables access to the OpenAI Assistants API.

export OPENAI_API_KEY="your-api-key-here"
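A common failure mode is starting the server with the key unset, which only surfaces once a tool call reaches OpenAI. A minimal sketch of a fail-fast check you could run before launch (the `require_api_key` helper is hypothetical, not part of openai-tool2mcp):

```python
import os


def require_api_key(env=os.environ):
    """Fail fast if OPENAI_API_KEY is missing or empty."""
    key = env.get("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError("Set OPENAI_API_KEY before starting the server")
    return key


# Illustration with a fake environment dict:
print(require_api_key({"OPENAI_API_KEY": "sk-test"}))  # → sk-test
```

Passing a dict instead of `os.environ` makes the check easy to unit-test without touching your real environment.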

Start the MCP server

You have two equivalent ways to start the server, depending on whether you prefer the uv runner or the direct CLI. Both start the server in STDIO transport mode, which is compatible with MCP clients.

# Preferred: use uv for MCP compatibility
uv run openai_tool2mcp/server_entry.py --transport stdio

# Or use the traditional CLI start command
openai-tool2mcp start --transport stdio
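Once the server is running, an MCP client opens the session by sending a JSON-RPC `initialize` request over stdio. As a sketch of what crosses the wire (the protocol version and capabilities shown are illustrative; check the MCP specification for the version your client targets):

```python
import json


def initialize_request(request_id=1):
    """Build the JSON-RPC 'initialize' message an MCP client sends first.

    The protocolVersion and clientInfo values here are examples, not
    requirements imposed by openai-tool2mcp.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    })
```

Writing this message to the server's stdin (followed by the `initialized` notification) is how an MCP client begins the handshake in stdio transport mode.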

Connect via Claude Desktop or other MCP clients

Configure your MCP client to reach the local server by providing a command that launches the MCP server over STDIO transport. The following example shows how to configure Claude Desktop to launch the server using uv; you can adapt it for other MCP clients with an equivalent configuration.

{
  "mcpServers": {
    "openai-tools": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/your/openai-tool2mcp",
        "run",
        "openai_tool2mcp/server_entry.py"
      ]
    }
  }
}
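If you installed the package from PyPI, an equivalent configuration could launch the console script directly instead of the uv runner (a sketch, assuming `openai-tool2mcp` is on the PATH that Claude Desktop uses):

```json
{
  "mcpServers": {
    "openai-tools": {
      "command": "openai-tool2mcp",
      "args": ["start", "--transport", "stdio"],
      "env": {
        "OPENAI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```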

Usage notes

If you prefer a CLI-based startup, the installed console script starts the same server:

openai-tool2mcp start --transport stdio

When using the uv runner instead, ensure the working directory contains the server entry script:

uv run openai_tool2mcp/server_entry.py --transport stdio

Available tools

WEB_SEARCH

OpenAI built-in web search tool wrapped as an MCP tool for performing live web queries from within MCP clients.

CODE_INTERPRETER

Code interpreter tool functionality exposed as an MCP service to run and evaluate code within the MCP workflow.

WEB_BROWSER

Web browser capability exposed as an MCP service to fetch and view web content from MCP clients.

FILE_MANAGEMENT

File management tools exposed as an MCP service for handling files within the MCP environment.
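Clients typically discover these tools at runtime via the MCP `tools/list` request rather than hard-coding them. Still, a small dispatch table can be handy for documentation or validation; a sketch (the tool identifiers below mirror this page and may differ from the exact names the server advertises):

```python
# Hypothetical lookup table mirroring the four wrapped tools described above.
TOOLS = {
    "web_search": "Live web queries via OpenAI's built-in web search",
    "code_interpreter": "Run and evaluate code within the MCP workflow",
    "web_browser": "Fetch and view web content from MCP clients",
    "file_management": "Handle files within the MCP environment",
}


def describe(tool_name):
    """Return a one-line description, or raise ValueError for unknown tools."""
    try:
        return TOOLS[tool_name]
    except KeyError:
        raise ValueError(f"Unknown tool: {tool_name}") from None
```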