openai-knowledge skill

/.codex/skills/openai-knowledge

This skill helps you retrieve authoritative OpenAI API documentation and endpoint schemas using MCP tools, ensuring accurate, up-to-date guidance.

npx playbooks add skill openai/openai-agents-js --skill openai-knowledge

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
1.8 KB
---
name: openai-knowledge
description: Use when working with the OpenAI API (Responses API) or OpenAI platform features (tools, streaming, Realtime API, auth, models, rate limits, MCP) and you need authoritative, up-to-date documentation (schemas, examples, limits, edge cases). Prefer the OpenAI Developer Documentation MCP server tools when available; otherwise guide the user to enable `openaiDeveloperDocs`.
---

# OpenAI Knowledge

## Overview

Use the OpenAI Developer Documentation MCP server to search and fetch exact docs (markdown), then base your answer on that text instead of guessing.

## Workflow

### 1) Check whether the Docs MCP server is available

If the `mcp__openaiDeveloperDocs__*` tools are available, use them.

If you are unsure, run `codex mcp list` and check for `openaiDeveloperDocs`.

### 2) Use MCP tools to pull exact docs

- Search first, then fetch the specific page(s).
  - `mcp__openaiDeveloperDocs__search_openai_docs` → pick the best URL.
  - `mcp__openaiDeveloperDocs__fetch_openai_doc` → retrieve the exact markdown (optionally with an `anchor`).
- When you need endpoint schemas or parameters, use:
  - `mcp__openaiDeveloperDocs__get_openapi_spec`
  - `mcp__openaiDeveloperDocs__list_api_endpoints`

Base your answer on the fetched text and quote or paraphrase it precisely. Do not invent flags, field names, defaults, or limits.

### 3) If MCP is not configured, guide setup (do not change config unless asked)

Provide one of these setup options, then ask the user to restart the Codex session so the tools load:

- CLI:
  - `codex mcp add openaiDeveloperDocs --url https://developers.openai.com/mcp`
- Config file (`~/.codex/config.toml`):
  - Add:
    ```toml
    [mcp_servers.openaiDeveloperDocs]
    url = "https://developers.openai.com/mcp"
    ```

Also point to: https://developers.openai.com/resources/docs-mcp#quickstart

Overview

This skill provides authoritative, up-to-date guidance when working with the OpenAI API, Realtime API, platform features, rate limits, and model schemas. It prefers the OpenAI Developer Documentation MCP server for exact documentation and gives clear setup instructions when that server is not available. Use it to avoid guessing fields, defaults, or limits and to cite precise docs.

How this skill works

First it checks for the presence of the OpenAI Developer Documentation MCP tools (mcp__openaiDeveloperDocs__*). If available, it runs a targeted search and fetch workflow to retrieve exact markdown pages, OpenAPI specs, or endpoint lists. If the MCP tools are not configured, it provides concise setup instructions for adding the MCP server and asks the user to restart the session so the tools load.
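To make the search-then-fetch sequence concrete, below is a minimal sketch that drives the same docs server with the MCP TypeScript SDK, assuming the server speaks the streamable HTTP transport. It is illustrative only: the un-prefixed tool names are inferred from the Codex tool names above, and the query and url argument names are assumptions to verify against the server's tool listing (only the anchor option is mentioned in the skill itself).

```ts
// Illustrative sketch only: "query" and "url" argument names are assumptions;
// check the server's tool listing. Tool names are the Codex names minus the
// "mcp__openaiDeveloperDocs__" prefix.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "docs-lookup", version: "0.0.1" });
await client.connect(
  new StreamableHTTPClientTransport(new URL("https://developers.openai.com/mcp")),
);

// 1) Search for candidate pages.
const hits = await client.callTool({
  name: "search_openai_docs",
  arguments: { query: "Responses API streaming events" },
});
console.log(hits);

// 2) Fetch the chosen page as exact markdown.
const page = await client.callTool({
  name: "fetch_openai_doc",
  arguments: { url: "<a URL picked from the search results>" },
});
console.log(page);

await client.close();
```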

When to use it

  • When you need exact API schemas, parameter lists, or example requests/responses.
  • When implementing Realtime API, streaming, or auth flows and you want authoritative limits or edge-case behavior.
  • When troubleshooting rate limits, model capabilities, or platform tool integrations.
  • When generating code that must match official API field names and types.
  • When you must cite or paraphrase official documentation rather than rely on memory.

Best practices

  • If MCP tools are missing, present the two setup options (CLI and config file) and instruct the user to restart the Codex session so the tools become available.
  • When citing docs, quote or paraphrase them precisely and reference the fetched page; note any limitations or version caveats mentioned in the source.
  • For implementation, validate against the fetched OpenAPI spec before deploying changes to ensure compatibility.
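One low-tech way to do that check, assuming you have saved the output of get_openapi_spec to a local openai-openapi.json file, is to print the request-body schema for the endpoint you are implementing and compare field names and types against your code. The file name and the /responses path below are assumptions, not something this skill guarantees.

```ts
// Sketch only: the spec file name and the "/responses" path are assumptions;
// confirm both against the spec actually returned by get_openapi_spec.
import { readFileSync } from "node:fs";

const spec = JSON.parse(readFileSync("openai-openapi.json", "utf8"));

// Drill into one operation's JSON request-body schema.
const operation = spec.paths?.["/responses"]?.post;
const schema = operation?.requestBody?.content?.["application/json"]?.schema;

console.log("required fields:", schema?.required);
console.log("known properties:", Object.keys(schema?.properties ?? {}));
```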

Example use cases

  • Implementing a streaming Realtime client and needing exact event names and payload schemas.
  • Checking model rate limits and retry headers to implement correct backoff logic (a sketch follows this list).
  • Generating typed TypeScript clients from the exact OpenAPI spec for production use.
  • Verifying auth flows and tool integrations before releasing multi-agent workflows.
  • Resolving ambiguity about a parameter or response field by fetching the official doc page and quoting it.
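For the rate-limit use case, here is a minimal sketch of backoff driven by the Retry-After header. Treat the header names and the retry policy as placeholders: the authoritative values are whatever the fetched rate-limit docs say.

```ts
// Sketch of backoff keyed off the Retry-After header on HTTP 429 responses.
// Confirm the exact headers and recommended behaviour in the fetched docs.
async function fetchWithBackoff(
  url: string,
  init: RequestInit,
  maxRetries = 5,
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const res = await fetch(url, init);
    if (res.status !== 429 || attempt >= maxRetries) return res;

    // Prefer the server's hint; otherwise fall back to exponential delay with jitter.
    const retryAfterSeconds = Number(res.headers.get("retry-after"));
    const delayMs =
      Number.isFinite(retryAfterSeconds) && retryAfterSeconds > 0
        ? retryAfterSeconds * 1000
        : Math.min(60_000, 2 ** attempt * 500 + Math.random() * 250);
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```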

FAQ

What if the MCP server tools are not available in my session?

Follow the provided setup steps: either run 'codex mcp add openaiDeveloperDocs --url https://developers.openai.com/mcp' or add the [mcp_servers.openaiDeveloperDocs] entry to ~/.codex/config.toml, then restart the Codex session so the tools load.

Which MCP tool should I call to get endpoint schemas?

Use mcp__openaiDeveloperDocs__get_openapi_spec for the full OpenAPI specification, or mcp__openaiDeveloperDocs__list_api_endpoints to enumerate endpoints and pick the specific ones to look up.

Can I rely on this skill for rate limit values and defaults?

Yes, provided the MCP server is used to fetch the official docs. The skill bases its answers on the fetched markdown or OpenAPI specs and does not invent limits or defaults.