Provides access to Airflow data and actions via MCP for AI assistants
Configuration
```json
{
  "mcpServers": {
    "astronomer-astro-airflow-mcp": {
      "url": "http://localhost:8000/mcp",
      "headers": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_PASSWORD": "admin",
        "AIRFLOW_AUTH_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
```

This MCP server exposes Airflow data and controls to AI assistants. It lets you query Airflow's REST API; manage DAGs, tasks, pools, variables, connections, assets, plugins, and providers; and run guided workflows for health checks and troubleshooting. This makes it easier to automate QA, diagnostics, and day-to-day Airflow operations through conversational agents.
Connect your MCP client to either a local stdio-based server or a remote HTTP endpoint. Running the server locally over the stdio transport integrates cleanly with clients that expect a stream-based interface; connecting to a dedicated HTTP endpoint suits a networked setup shared by multiple clients.
Prerequisites: You need Python tooling, and possibly Node tooling, depending on how your MCP client launches servers. The MCP server runtime is published on PyPI and exposed via a command you can run directly. Use the stdio transport for local runs or HTTP for remote deployments.
Step 1: Install the MCP runtime from PyPI. The provided entry point runs directly, with no additional build steps.
Step 2: Start the MCP server in stdio mode. This runs as a local process that communicates with your client over standard input/output.
Step 3: Optionally configure the client to connect to a remote Airflow instance, or to a local Airflow instance exposed over HTTP. Pass the Airflow URL and credentials as environment variables or as CLI flags when launching the MCP server with the HTTP transport.
The following environment variables configure the Airflow connection: AIRFLOW_API_URL, AIRFLOW_USERNAME, AIRFLOW_PASSWORD, and AIRFLOW_AUTH_TOKEN. They control how the MCP server authenticates to Airflow and which Airflow instance it targets.
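As a rough sketch of how these variables could interact (assuming a token, when set, takes precedence over basic auth; `build_airflow_auth` is an illustrative helper, not part of the server's API):

```python
import base64
import os

def build_airflow_auth(env=None):
    """Resolve the documented env vars into a base URL and auth header.

    Hypothetical helper: the precedence (token over basic auth) and the
    localhost default are assumptions for illustration.
    """
    env = os.environ if env is None else env
    base_url = env.get("AIRFLOW_API_URL", "http://localhost:8080")
    token = env.get("AIRFLOW_AUTH_TOKEN")
    if token:
        # Bearer token works against Airflow 2.x and 3.x APIs.
        return base_url, {"Authorization": f"Bearer {token}"}
    # Fall back to basic auth (Airflow 2.x).
    user = env.get("AIRFLOW_USERNAME", "admin")
    password = env.get("AIRFLOW_PASSWORD", "admin")
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return base_url, {"Authorization": f"Basic {creds}"}
```

Passing the variables explicitly (rather than reading `os.environ`) makes the resolution easy to test and keeps secrets out of call sites.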
Two common deployment patterns exist: a standalone HTTP server and an Airflow plugin. In standalone mode, you run the MCP server as an independent ASGI application. In plugin mode, you install the MCP server into Airflow 3.x so the MCP endpoint is served from the Airflow webserver.
The server supports Bearer tokens for Airflow 2.x and 3.x, OAuth2 token exchange for Airflow 3.x, and basic auth for Airflow 2.x. Choose the method that matches your Airflow deployment. When running in HTTP mode, terminate TLS in front of the server and store credentials securely.
If you cannot reach the MCP endpoint, verify that the Airflow API URL is reachable and that credentials are valid. Check that the MCP service is listening on the expected port and that any firewalls permit traffic from your MCP clients.
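A quick reachability probe against Airflow's stable REST API can help narrow this down. The sketch below assumes an Airflow 2.x deployment, where the health endpoint lives at `/api/v1/health`:

```python
import json
import urllib.request

def health_url(base_url: str) -> str:
    """Build the Airflow 2.x stable-API health endpoint URL."""
    return f"{base_url.rstrip('/')}/api/v1/health"

def check_airflow_health(base_url: str, timeout: float = 5.0) -> dict:
    """Return the parsed health payload, e.g.
    {"metadatabase": {"status": "healthy"}, "scheduler": {...}};
    raises URLError/HTTPError if the instance is unreachable."""
    with urllib.request.urlopen(health_url(base_url), timeout=timeout) as resp:
        return json.load(resp)
```

If this call fails while the MCP endpoint itself responds, the problem is likely the `AIRFLOW_API_URL` value or network path rather than the MCP server.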
Tools
Retrieve all DAGs with their metadata and basic details.
Fetch detailed information for a specific DAG.
Return the source code for a DAG.
Get DAG run statistics (Airflow 3.x only).
Start a new DAG run (execute a workflow).
Pause a DAG to stop scheduled runs.
Resume a paused DAG to continue scheduled runs.
List all tasks within a DAG.
Get details about a specific task.
Retrieve logs for a task instance run.
List all resource pools.
Get details about a specific pool.
List all Airflow variables.
Get a specific variable by key.
List all connections (credentials excluded for security).
List assets/datasets with unified naming across Airflow versions.
List installed Airflow plugins.
List installed provider packages.
Fetch Airflow configuration values.
Fetch Airflow version information.
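These tools map onto Airflow's REST API. As one hedged example of what the "start a new DAG run" tool does under the hood (endpoint path per the Airflow 2.x stable API; the helper names are illustrative, not the server's own):

```python
import json
import urllib.request

def dag_runs_url(base_url: str, dag_id: str) -> str:
    """Airflow 2.x stable-API endpoint for creating/listing DAG runs."""
    return f"{base_url.rstrip('/')}/api/v1/dags/{dag_id}/dagRuns"

def trigger_dag_run(base_url, dag_id, token, conf=None):
    """POST a new DAG run (optionally with a conf payload) and return
    the created run's metadata as a dict."""
    body = json.dumps({"conf": conf or {}}).encode()
    req = urllib.request.Request(
        dag_runs_url(base_url, dag_id),
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In practice your MCP client invokes the corresponding tool conversationally; the sketch is only meant to show which Airflow endpoint the request ultimately reaches.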