The OpenAI WebSearch MCP Server provides intelligent web search through OpenAI's reasoning models, giving AI assistants access to up-to-date information backed by model-side reasoning.
```shell
OPENAI_API_KEY=sk-xxxx uvx --with openai-websearch-mcp openai-websearch-mcp-install
```

Replace `sk-xxxx` with your OpenAI API key from the OpenAI Platform.
```shell
# Install and run directly
uvx openai-websearch-mcp

# Or install globally
uv tool install openai-websearch-mcp
```

```shell
# Install from PyPI
pip install openai-websearch-mcp

# Run the server
python -m openai_websearch_mcp
```
Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
```
Add to your MCP settings in Cursor (`Cmd/Ctrl + ,`):

```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
```
Claude Code automatically detects MCP servers configured for Claude Desktop. Use the same configuration as above for Claude Desktop.
For local testing, use the absolute path to your virtual environment:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "/path/to/your/project/.venv/bin/python",
      "args": ["-m", "openai_websearch_mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini",
        "PYTHONPATH": "/path/to/your/project/src"
      }
    }
  }
}
```
Intelligent web search with reasoning model support.
| Parameter | Type | Description | Default |
|---|---|---|---|
| `input` | string | The search query or question to search for | Required |
| `model` | string | AI model to use. Supports `gpt-4o`, `gpt-4o-mini`, `gpt-5`, `gpt-5-mini`, `gpt-5-nano`, `o3`, `o4-mini` | `gpt-5-mini` |
| `reasoning_effort` | string | Reasoning effort level: `low`, `medium`, `high`, `minimal` | Smart default |
| `type` | string | Web search API version | `web_search_preview` |
| `search_context_size` | string | Context amount: `low`, `medium`, `high` | `medium` |
| `user_location` | object | Optional location for localized results | `null` |
Once configured, simply ask your AI assistant to search for information using natural language:
"Search for the latest developments in AI reasoning models using openai_web_search"
"Use openai_web_search with gpt-5 and high reasoning effort to provide a comprehensive analysis of quantum computing breakthroughs"
"Search for local tech meetups in San Francisco this week using openai_web_search"
- `gpt-5-mini` with `reasoning_effort: "low"`
- `gpt-5` with `reasoning_effort: "medium"` or `"high"`
| Model | Reasoning | Default Effort | Best For |
|---|---|---|---|
| `gpt-4o` | ❌ | N/A | Standard search |
| `gpt-4o-mini` | ❌ | N/A | Basic queries |
| `gpt-5-mini` | ✅ | low | Fast iterations |
| `gpt-5` | ✅ | medium | Deep research |
| `gpt-5-nano` | ✅ | medium | Balanced approach |
| `o3` | ✅ | medium | Advanced reasoning |
| `o4-mini` | ✅ | medium | Efficient reasoning |
| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | Your OpenAI API key | Required |
| `OPENAI_DEFAULT_MODEL` | Default model to use | `gpt-5-mini` |
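The resolution of these two variables can be sketched as follows. `resolve_config` is a hypothetical helper, shown only to illustrate the required key and the documented model default:

```python
import os

# Hypothetical sketch of configuration resolution; the function name is an
# assumption, not the server's actual internals.
def resolve_config(env=None):
    """Return (api_key, model), applying the documented model default."""
    env = os.environ if env is None else env
    api_key = env.get("OPENAI_API_KEY")
    if not api_key:
        raise ValueError("OPENAI_API_KEY is required")
    return api_key, env.get("OPENAI_DEFAULT_MODEL", "gpt-5-mini")

print(resolve_config({"OPENAI_API_KEY": "sk-test"}))
```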
```shell
# For uvx installations
npx @modelcontextprotocol/inspector uvx openai-websearch-mcp

# For pip installations
npx @modelcontextprotocol/inspector python -m openai_websearch_mcp
```
**Issue:** "Unsupported parameter: 'reasoning.effort'"

**Solution:** This occurs when the `reasoning_effort` parameter is used with non-reasoning models (`gpt-4o`, `gpt-4o-mini`). The server handles this automatically by only applying reasoning parameters to compatible models.
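The gating behavior described above can be sketched like this; the constant and function names are assumptions, not the server's actual internals:

```python
# Sketch of reasoning-parameter gating: effort is attached only for models
# that support it, so non-reasoning models never see the parameter.
REASONING_MODELS = {"gpt-5", "gpt-5-mini", "gpt-5-nano", "o3", "o4-mini"}

def request_params(model, reasoning_effort=None):
    """Attach reasoning.effort only for models that support it."""
    params = {"model": model}
    if reasoning_effort and model in REASONING_MODELS:
        params["reasoning"] = {"effort": reasoning_effort}
    return params

# gpt-4o silently drops the effort setting instead of raising an API error
print(request_params("gpt-4o", "high"))
print(request_params("gpt-5", "high"))
```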
**Issue:** "No module named 'openai_websearch_mcp'"

**Solution:** Ensure you've installed the package correctly and your Python path includes the package location.
To add this MCP server to Claude Code, run this command in your terminal:
```shell
claude mcp add-json "openai-websearch-mcp" '{"command":"uvx","args":["openai-websearch-mcp"],"env":{"OPENAI_API_KEY":"your-api-key-here"}}'
```
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the `~/.cursor/mcp.json` file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the `.cursor/mcp.json` file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server". This opens the `~/.cursor/mcp.json` file, where you can add your server like this:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```
To add an MCP server to a project, create a new `.cursor/mcp.json` file or add the server to the existing one. The configuration looks exactly the same as the global MCP server example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then be able to see the available tools the added MCP server has available and will call them when it needs to.
You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.
To add this MCP server to Claude Desktop:
1. Find your configuration file:
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
   - Linux: `~/.config/Claude/claude_desktop_config.json`
2. Add this to your configuration file:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```
3. Restart Claude Desktop for the changes to take effect.