MCP server for OpenAI o3 web search
Configuration
You can run the o3-search MCP server to give an AI agent access to OpenAI models (o3, o4-mini, or gpt-5) and live web search. Once the server is registered with your agent, the agent can autonomously consult the chosen model and browse sources to provide informed answers, design reviews, and up-to-date library guidance. Typical uses include debugging startup issues by having the agent search for known solutions, upgrading libraries with live references, and brainstorming complex designs with a connected design reviewer.
Prerequisites: you need Node.js and a package manager (npm or pnpm). Ensure you have a valid OpenAI API key. You will also set environment variables to configure the model and search behavior.
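As a quick sanity check, the following sketch (POSIX shell; `your-api-key` is a placeholder for your real key) confirms the tooling is present and exports the key for the current session:

```shell
# Report whether Node.js and a package manager are available (no hard failure).
command -v node >/dev/null 2>&1 && node --version || echo "Node.js not found"
command -v pnpm >/dev/null 2>&1 || command -v npm >/dev/null 2>&1 || echo "npm/pnpm not found"

# Export the API key for this session; the ':?' expansion aborts if it is empty.
export OPENAI_API_KEY="your-api-key"
: "${OPENAI_API_KEY:?OPENAI_API_KEY must be set}"
echo "prerequisites look OK"
```

For anything beyond local experiments, load the key from a secrets manager or an untracked `.env` file rather than hard-coding it.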
{
"mcpServers": {
"o3-search": {
"command": "npx",
"args": ["o3-search-mcp"],
"env": {
"OPENAI_API_KEY": "your-api-key",
// Optional: o3, o4-mini, gpt-5 (default: o3)
"OPENAI_MODEL": "o3",
// Optional: low, medium, high (default: medium)
"SEARCH_CONTEXT_SIZE": "medium",
// Optional: low, medium, high (default: medium)
"REASONING_EFFORT": "medium",
// Optional: API timeout in milliseconds (default: 300000)
"OPENAI_API_TIMEOUT": "300000",
// Optional: Maximum number of retries (default: 3)
"OPENAI_MAX_RETRIES": "3"
}
}
}
}

If you prefer to run the MCP server locally, clone the project, install dependencies, build it, and start the server with Node. The following steps mirror the local setup flow.
# Clone the project
$ git clone [email protected]:yoshiko-pg/o3-search-mcp.git
$ cd o3-search-mcp
# Install dependencies
$ pnpm install
# Build the project
$ pnpm build
# Run the MCP server locally (example with built index)
$ node /path/to/o3-search-mcp/build/index.js

The configuration below points your MCP client at the local build instead of `npx`; use whichever method matches how you want to run the server in your environment.
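MCP servers communicate over stdio using JSON-RPC 2.0. As a rough smoke test (a sketch: the protocol version string and the build path are illustrative, not verified against this server), you can construct an `initialize` request and pipe it into the built entry point:

```shell
# Build a minimal JSON-RPC 2.0 initialize request, as used by the MCP handshake.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
echo "$REQUEST"

# To exercise a local build, pipe the request into the server (path is an example):
#   echo "$REQUEST" | node /path/to/o3-search-mcp/build/index.js
```

A healthy server answers on stdout with a JSON-RPC response describing its capabilities; no response usually points to a build or environment problem.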
{
"mcpServers": {
"o3-search": {
"command": "node",
"args": ["/path/to/o3-search-mcp/build/index.js"],
"env": {
"OPENAI_API_KEY": "your-api-key",
// Optional: o3, o4-mini, gpt-5 (default: o3)
"OPENAI_MODEL": "o3",
// Optional: low, medium, high (default: medium)
"SEARCH_CONTEXT_SIZE": "medium",
// Optional: low, medium, high (default: medium)
"REASONING_EFFORT": "medium",
// Optional: API timeout in milliseconds (default: 300000)
"OPENAI_API_TIMEOUT": "300000",
// Optional: Maximum number of retries (default: 3)
"OPENAI_MAX_RETRIES": "3"
}
}
}
}

Configure the following environment variables to control model selection, search behavior, and performance.
OPENAI_API_KEY=your-api-key
OPENAI_MODEL=o3              # optional: o3, o4-mini, gpt-5 (default: o3)
SEARCH_CONTEXT_SIZE=medium   # optional: low, medium, high (default: medium)
REASONING_EFFORT=medium      # optional: low, medium, high (default: medium)
OPENAI_API_TIMEOUT=300000    # optional: API timeout in milliseconds (default: 300000)
OPENAI_MAX_RETRIES=3         # optional: maximum number of retries (default: 3)

To access the o3 model through the OpenAI API, you may need to upgrade your OpenAI usage tier or verify your organization. If your API key is not enabled for o3, requests will fail. Check the model access requirements before deploying, and keep your API key secure.
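One way to check model access ahead of time (a sketch; it assumes `curl` is installed and uses the OpenAI `GET /v1/models/{model}` endpoint) is to ask the models API whether your key can see the configured model. The block below only prints the command so nothing is sent accidentally; drop the `echo` to actually run it:

```shell
# Print a curl command that queries the OpenAI API for the configured model.
MODEL="${OPENAI_MODEL:-o3}"   # falls back to o3, the server's default model
echo "curl -s https://api.openai.com/v1/models/${MODEL} -H 'Authorization: Bearer \$OPENAI_API_KEY'"
```

If the real request returns a model object, your key has access; a `model_not_found` or permission error means the key or organization is not enabled for that model.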