
openai-knowledge skill

/.codex/skills/openai-knowledge

This skill helps you access authoritative OpenAI API documentation via the OpenAI Developer Documentation MCP server, ensuring accurate guidance from the latest developer docs.

npx playbooks add skill openai/openai-agents-python --skill openai-knowledge

Review the files below or copy the command above to add this skill to your agents.

Files (1): SKILL.md (1.8 KB)
---
name: openai-knowledge
description: Use when working with the OpenAI API (Responses API) or OpenAI platform features (tools, streaming, Realtime API, auth, models, rate limits, MCP) and you need authoritative, up-to-date documentation (schemas, examples, limits, edge cases). Prefer the OpenAI Developer Documentation MCP server tools when available; otherwise guide the user to enable `openaiDeveloperDocs`.
---

# OpenAI Knowledge

## Overview

Use the OpenAI Developer Documentation MCP server to search and fetch exact docs (markdown), then base your answer on that text instead of guessing.

## Workflow

### 1) Check whether the Docs MCP server is available

If the `mcp__openaiDeveloperDocs__*` tools are available, use them.

If you are unsure, run `codex mcp list` and check for `openaiDeveloperDocs`.
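
If you prefer to script this check (for example from a setup script), a minimal Python sketch is below. It only shells out to the `codex mcp list` command shown above and assumes that command prints configured server names as plain text.

```python
import shutil
import subprocess

# Sketch only: verify the Codex CLI is on PATH, then look for the
# openaiDeveloperDocs entry in the `codex mcp list` output.
if shutil.which("codex") is None:
    print("codex CLI not found on PATH")
else:
    result = subprocess.run(["codex", "mcp", "list"], capture_output=True, text=True)
    configured = "openaiDeveloperDocs" in result.stdout
    print(f"openaiDeveloperDocs configured: {configured}")
```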

### 2) Use MCP tools to pull exact docs

- Search first, then fetch the specific page or pages.
  - `mcp__openaiDeveloperDocs__search_openai_docs` → pick the best URL.
  - `mcp__openaiDeveloperDocs__fetch_openai_doc` → retrieve the exact markdown (optionally with an `anchor`).
- When you need endpoint schemas or parameters, use:
  - `mcp__openaiDeveloperDocs__get_openapi_spec`
  - `mcp__openaiDeveloperDocs__list_api_endpoints`

Base your answer on the fetched text and quote or paraphrase it precisely. Do not invent flags, field names, defaults, or limits.
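
To illustrate the search-then-fetch sequence, here is a minimal Python sketch. The `call_tool` helper is hypothetical (a stand-in for however your agent runtime invokes MCP tools), and the argument names are assumptions except for `anchor`, which the fetch tool documents as optional.

```python
# Illustration only: `call_tool` is a hypothetical stand-in for the runtime's
# MCP tool-call mechanism; a real agent invokes these tools directly.

def call_tool(name: str, arguments: dict) -> str:
    """Echo the call so this sketch runs standalone; a real agent would
    dispatch the request to the openaiDeveloperDocs MCP server instead."""
    print(f"would call {name} with {arguments}")
    return ""

# 1) Search the docs and pick the best URL from the results.
results = call_tool(
    "mcp__openaiDeveloperDocs__search_openai_docs",
    {"query": "Responses API streaming events"},  # hypothetical query
)

# 2) Fetch the exact markdown for the chosen page, optionally scoped to an anchor.
page = call_tool(
    "mcp__openaiDeveloperDocs__fetch_openai_doc",
    {"url": "<best URL from the search results>", "anchor": "<optional section anchor>"},
)

# For endpoint schemas or parameters, call get_openapi_spec / list_api_endpoints
# in the same way, then quote the returned text rather than inventing fields.
```

The point of the sketch is only the order of operations: search, pick a URL, fetch the exact text, then quote or paraphrase it.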

### 3) If MCP is not configured, guide setup (do not change config unless asked)

Provide one of these setup options, then ask the user to restart the Codex session so the tools load:

- CLI:
  - `codex mcp add openaiDeveloperDocs --url https://developers.openai.com/mcp`
- Config file (`~/.codex/config.toml`):
  - Add:
    ```toml
    [mcp_servers.openaiDeveloperDocs]
    url = "https://developers.openai.com/mcp"
    ```

Also point to: https://developers.openai.com/resources/docs-mcp#quickstart

Overview

This skill helps you answer questions about the OpenAI API and platform by fetching authoritative, up-to-date documentation. It prefers using the OpenAI Developer Documentation MCP server to return exact doc text, schemas, examples, and limits rather than guessing. Use this skill when you need precise endpoint parameters, response schemas, rate limits, or platform feature details.

How this skill works

First, the skill checks whether the openaiDeveloperDocs MCP server tools are available and uses them when present. It performs a guided two-step workflow: search the docs to find the best page, then fetch the exact markdown or OpenAPI spec for accurate quoting. If MCP tools are not configured, the skill provides clear setup instructions and asks you to restart the Codex session so the tools can load.

When to use it

  • You need exact endpoint parameters, schemas, or request/response examples for the Responses, Realtime, or other OpenAI APIs.
  • You must confirm current rate limits, auth behavior, or model availability rather than relying on memory.
  • You want to quote official developer docs or reproduce exact examples and edge-case guidance.
  • You are implementing multi-agent workflows that depend on platform features (tools, streaming, MCP).
  • You need authoritative guidance for troubleshooting integration or SDK usage.

Best practices

  • Prefer the MCP server tools; search first, then fetch specific pages or anchors for precise text.
  • When requesting schemas or endpoints, use the OpenAPI spec and endpoint list functions to avoid inventing fields or defaults.
  • If MCP is unavailable, follow the provided CLI or config-file setup steps and restart Codex before asking for docs.
  • Ask for the specific endpoint, model name, or feature and any relevant anchors to reduce fetch scope.
  • Treat fetched markdown as the source of truth and paraphrase or quote it accurately.

Example use cases

  • Verify the exact request body and response schema for the Responses API before implementing client code.
  • Confirm streaming and realtime API behavior, example frames, and limits for a low-latency agent.
  • Check current rate limits, model capabilities, or policy notes for production deployment planning.
  • Fetch the OpenAPI spec to generate client code or validate request parameters automatically.
  • Guide a user to enable the openaiDeveloperDocs MCP server when their environment lacks it.

FAQ

What if the MCP server tools are not present?

I will show two setup options (CLI and config file) and ask you to restart the Codex session so the MCP tools load.

Will you ever guess fields or limits?

No. When MCP is available, I fetch and base answers on the exact docs or OpenAPI spec; when it is not, I explain how to enable MCP so authoritative docs can be used.