Airflow MCP Server
Provides an MCP interface to Apache Airflow REST endpoints for DAGs, DAG runs, tasks, variables, connections, and monitoring.
Configuration
{
"mcpServers": {
"airflow_mcp_basic": {
"command": "uvx",
"args": [
"mcp-server-apache-airflow"
],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password",
"AIRFLOW_JWT_TOKEN": "eyJhbGciOiJI..."
}
}
}
}

You can run an MCP server that wraps Apache Airflow's REST API, exposing Airflow data and actions to MCP clients in a standardized way. This enables you to list DAGs, manage DAG runs, access task details and logs, and read configuration and monitoring information through a consistent interface.
You use this MCP server with an MCP client to interact with Airflow through whichever MCP API groups you enable. Start the server and, if desired, restrict it to read-only operations to prevent any changes. You can choose specific API groups to expose, such as dag, dagrun, variable, health, and more, by selecting them at startup. The server can run locally over stdio (so an MCP client launches and talks to it directly) or be started by an installed CLI/manager that manages the server process.
Prerequisites: you need Node tooling to install and run MCP servers, and you will use the provided installer commands to bring up this server.
Install the MCP server via the official installer command for Claude Desktop clients (this pulls the server and makes it available to Claude):
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude

There are multiple ways to run the MCP server, depending on whether you want a local stdio interface or the uv runtime. The following run configurations cover the practical options.
Run with uvx (recommended for Claude Desktop integration) using one of the following configurations: the first starts the server in standard mode; the second enables read-only mode.
{
"mcpServers": {
"airflow_mcp_basic": {
"command": "uvx",
"args": ["mcp-server-apache-airflow"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password"
}
}
}
}

{
"mcpServers": {
"airflow_mcp_readonly": {
"command": "uvx",
"args": ["mcp-server-apache-airflow", "--read-only"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_JWT_TOKEN": "your-jwt-token"
}
}
}
}

If you prefer using uv directly with a directory path, you can run the server with uv and point it to the local folder where the server package resides.
{
"mcpServers": {
"airflow_mcp_uv": {
"command": "uv",
"args": [
"--directory",
"/path/to/mcp-server-apache-airflow",
"run",
"mcp-server-apache-airflow"
],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password"
}
}
}
}

You can limit exposed API groups to focus your MCP client on the endpoints you need. For example, you can enable only the DAG and Variable APIs. You can also combine this with read-only mode to prevent destructive actions.
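For instance, combining group selection with read-only mode in a Claude Desktop entry might look like the sketch below; the --read-only and --apis flags are the ones used in the examples on this page, and the server label airflow_mcp_dag_var_readonly is an arbitrary name:

```json
{
  "mcpServers": {
    "airflow_mcp_dag_var_readonly": {
      "command": "uvx",
      "args": [
        "mcp-server-apache-airflow",
        "--read-only",
        "--apis", "dag",
        "--apis", "variable"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```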
uv run mcp-server-apache-airflow --apis dag --apis variable

{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uvx",
"args": ["mcp-server-apache-airflow", "--apis", "dag", "--apis", "variable"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password"
}
}
}
}

If you want to run the server directly in a development or testing environment, you can start the service with the provided make targets or use uv to run the source directly. The available options include the port and transport type for the SSE interface.
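If you go the uv route, an SSE run might look like the sketch below. The exact flag names (--transport, --port) are assumptions inferred from the options mentioned above, not confirmed against the package; verify them with the server's --help output.

```shell
# Flag names below are assumptions; check: uv run mcp-server-apache-airflow --help
uv run mcp-server-apache-airflow --transport sse --port 8000
```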
make run

make run-sse

List all DAGs using /api/v1/dags to discover available DAGs in Airflow.
Fetch details for a specific DAG via /api/v1/dags/{dag_id}.
Pause a DAG via /api/v1/dags/{dag_id} by updating its paused state.
Create, list, update, or delete DAG runs via /api/v1/dags/{dag_id}/dagRuns and related endpoints.
Check system health at /api/v1/health to verify Airflow availability.
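The endpoints above can also be exercised directly with curl, which is a quick way to sanity-check connectivity and credentials before wiring up the MCP server. The host, credentials, and DAG id below are placeholders:

```shell
# Placeholders throughout: substitute your Airflow host, credentials, and a real DAG id.
curl -s -u "your-username:your-password" "https://your-airflow-host/api/v1/dags"
curl -s -u "your-username:your-password" "https://your-airflow-host/api/v1/dags/example_dag"
# Pause a DAG by PATCHing its is_paused field:
curl -s -u "your-username:your-password" -X PATCH \
  -H "Content-Type: application/json" -d '{"is_paused": true}' \
  "https://your-airflow-host/api/v1/dags/example_dag"
# Check Airflow health:
curl -s -u "your-username:your-password" "https://your-airflow-host/api/v1/health"
```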