Exposes OpenAPI endpoints as MCP tools with optional prompts and resources for streamlined AI interactions.
Configuration
```json
{
  "mcpServers": {
    "ivo-toby-mcp-openapi-server": {
      "url": "http://localhost:3000/mcp",
      "headers": {
        "API_HEADERS": "Authorization:Bearer token123,X-API-Key:your-api-key",
        "API_BASE_URL": "https://api.example.com",
        "OPENAPI_SPEC_PATH": "https://api.example.com/openapi.json"
      }
    }
  }
}
```
You run an MCP server that exposes OpenAPI endpoints as MCP tools, enabling your AI systems to discover and interact with REST APIs defined by OpenAPI specifications through the MCP protocol. The server supports both local (stdio) and HTTP transport, making it easy to use with desktop MCP clients as well as web or networked clients.
You can use the server in two primary ways: as a CLI tool or as a library imported into your own Node.js application. It supports two transport modes: stdio for direct integration with MCP-aware clients, and streamable HTTP for web and HTTP-capable systems.
Prerequisites: Node.js (version 16 or newer is recommended) and a working internet connection to fetch dependencies during installation.
```shell
# Quick start via CLI (no local install required)
npx @ivotoby/openapi-mcp-server \
  --api-base-url https://api.example.com \
  --openapi-spec https://api.example.com/openapi.json \
  --headers "Authorization:Bearer token123, X-API-Key: your-api-key" \
  --transport http \
  --port 3000
```
The server accepts configuration through environment variables or CLI arguments. Common options include the base API URL, the OpenAPI specification path, authentication headers, the transport type (stdio or http), and the port for HTTP transport. You can also enable prompts and resources alongside your tools.
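The same configuration can be expressed through environment variables. The variable names below (`API_BASE_URL`, `OPENAPI_SPEC_PATH`, `API_HEADERS`) match the keys shown in the JSON configuration above; confirm against the project documentation that your version of the CLI reads them.

```shell
# Configure via environment variables instead of flags.
export API_BASE_URL="https://api.example.com"
export OPENAPI_SPEC_PATH="https://api.example.com/openapi.json"
export API_HEADERS="Authorization:Bearer token123,X-API-Key:your-api-key"

# Transport and port are still passed as flags, as in the quick start above.
npx @ivotoby/openapi-mcp-server --transport http --port 3000
```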
In addition to API endpoint tools, you can expose reusable prompts and static resources via the MCP protocol. Prompts are templated messages with placeholders, while resources are read-only content such as API documentation or schemas. You can load prompts and resources from files, URLs, or inline JSON.
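As an illustration, an inline JSON definition might look like the sketch below. The field names follow general MCP conventions for prompts and resources (`name`, `description`, `arguments`; `uri`, `mimeType`); the exact schema this server expects is an assumption, so check the project docs before relying on it.

```json
{
  "prompts": [
    {
      "name": "summarize-endpoint",
      "description": "Summarize what an API endpoint does",
      "arguments": [{ "name": "endpoint", "required": true }]
    }
  ],
  "resources": [
    {
      "uri": "docs://api/overview",
      "name": "API overview",
      "mimeType": "text/markdown",
      "text": "High-level documentation for the example API."
    }
  ]
}
```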
When using HTTP transport, you can check server health at the health endpoint. The response reports the server status, the number of active sessions, and uptime. Orchestrators and load balancers can use this endpoint for readiness and liveness checks.
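A readiness probe only needs to inspect that payload. The sketch below assumes a `/health` path and the field names `status`, `activeSessions`, and `uptime`; all three are hypothetical, so verify the actual response shape against the running server.

```typescript
// Hypothetical shape of the health response; actual field names may differ.
interface HealthResponse {
  status: string;          // e.g. "ok"
  activeSessions: number;  // live MCP sessions
  uptime: number;          // seconds since start
}

// Decide readiness from a parsed health payload.
function isReady(health: HealthResponse): boolean {
  return health.status === "ok";
}

// Example payload, as it might be returned by `curl http://localhost:3000/health`:
const sample: HealthResponse = { status: "ok", activeSessions: 2, uptime: 3600 };
console.log(isReady(sample)); // prints "true"
```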
For security, the HTTP transport validates request origins and binds to localhost by default. Add authentication when exposing the server beyond the local machine. For troubleshooting, enable verbose logs in HTTP transport mode to diagnose initialization and runtime issues.
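Origin validation amounts to checking each request's `Origin` header against an explicit allowlist. The sketch below is a generic illustration of that check, not this library's actual API:

```typescript
// Generic origin-validation sketch: when exposing an HTTP transport beyond
// localhost, reject requests whose Origin header is not explicitly allowed.
function isAllowedOrigin(origin: string | undefined, allowlist: string[]): boolean {
  return origin !== undefined && allowlist.includes(origin);
}

const allowlist = ["http://localhost:3000", "https://app.example.com"];
console.log(isAllowedOrigin("https://app.example.com", allowlist)); // true
console.log(isAllowedOrigin("https://evil.example", allowlist));    // false
console.log(isAllowedOrigin(undefined, allowlist));                 // false
```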
If you prefer a programmatic approach, import the OpenAPIServer class in your Node.js project and configure it with your API, spec, and transport. This lets you tailor authentication with an AuthProvider, apply endpoint filters, and embed the server into larger applications.
Leverage the library model to create dedicated MCP servers for specific APIs, distribute them as npm packages, and combine tools with prompts and resources for richer interactions. You can also implement dynamic authentication via the AuthProvider interface to handle expiring tokens and complex auth flows.
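The token-refresh pattern behind dynamic authentication can be sketched as follows. The real `AuthProvider` interface in `@ivotoby/openapi-mcp-server` may differ; the local stand-in below only illustrates caching a short-lived token and refreshing it before expiry.

```typescript
// Local stand-in for an AuthProvider-style interface (the library's actual
// interface shape is an assumption; consult its documentation).
interface AuthProvider {
  getAuthHeaders(): Promise<Record<string, string>>;
}

class ExpiringTokenProvider implements AuthProvider {
  private token: string | null = null;
  private expiresAt = 0; // epoch milliseconds

  constructor(private fetchToken: () => Promise<{ token: string; ttlMs: number }>) {}

  async getAuthHeaders(): Promise<Record<string, string>> {
    // Refresh the cached token shortly before it expires.
    if (this.token === null || Date.now() >= this.expiresAt - 5_000) {
      const { token, ttlMs } = await this.fetchToken();
      this.token = token;
      this.expiresAt = Date.now() + ttlMs;
    }
    return { Authorization: `Bearer ${this.token}` };
  }
}

// Usage with a stubbed token endpoint:
const provider = new ExpiringTokenProvider(async () => ({ token: "token123", ttlMs: 60_000 }));
provider.getAuthHeaders().then((h) => console.log(h.Authorization)); // "Bearer token123"
```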
The server exposes tools for working with the spec, for example:
- a tool that lists the endpoints defined in the OpenAPI spec, for exploration and selection;
- a tool that retrieves the input schema for a specific endpoint;
- a tool that invokes a chosen endpoint with the provided parameters;
- per-endpoint tools generated from the spec, such as one for the GET /users endpoint and one for the POST /users endpoint.
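An MCP client invokes any of these tools with the protocol's standard `tools/call` request. The method name below is part of the MCP specification; the tool name `get-users` and its argument are hypothetical, since the actual generated tool names depend on the spec and this server's naming scheme.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-users",
    "arguments": { "limit": 10 }
  }
}
```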