
Airflow MCP Server

Provides access to Airflow data and actions via MCP for AI assistants

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "astronomer-astro-airflow-mcp": {
      "url": "http://localhost:8000/mcp",
      "headers": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_PASSWORD": "admin",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_AUTH_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}

This MCP server exposes Airflow data and controls to AI assistants. It lets you query Airflow’s REST API; manage DAGs, tasks, pools, variables, connections, assets, plugins, and providers; and run guided workflows for health checks and troubleshooting. This makes it easier to automate QA, diagnostics, and day-to-day Airflow operations through conversational agents.

How to use

Connect your MCP client to either a local stdio-based server or a remote HTTP endpoint. Running the server locally over the stdio transport integrates cleanly with clients that expect a stream-based interface. Connecting to a dedicated HTTP endpoint suits a networked setup shared by multiple clients.
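For the local stdio setup, the MCP client launches the server process itself instead of connecting to a URL. A minimal sketch of what that client configuration could look like (the command and package name here are illustrative assumptions, not the published names):

{
  "mcpServers": {
    "astronomer-astro-airflow-mcp": {
      "command": "uvx",
      "args": ["airflow-mcp-server"],
      "env": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_PASSWORD": "admin"
      }
    }
  }
}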

How to install

Prerequisites: Python tooling (and possibly Node tooling, depending on your MCP client). The MCP server runtime is distributed on PyPI and exposed as a runnable command. Use the stdio transport for local runs or the HTTP transport for remote deployments.

Step 1: Install the MCP runtime from PyPI. The published command runs directly, with no additional build steps.

Step 2: Start the MCP server in stdio mode. This runs as a local process that communicates with your client over standard input/output.

Step 3: Optionally configure the client to connect to a remote Airflow instance or to a local Airflow instance exposed via HTTP. You can pass the Airflow URL and credentials as environment variables or as CLI flags when launching the MCP server with an HTTP transport.
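The three steps above could look like the following shell session. The package name, binary name, and flags are assumptions for illustration; check the project's own docs for the real ones.

```shell
# Step 1: install the server from PyPI (hypothetical package name)
pip install airflow-mcp-server

# Step 2: run in stdio mode as a local process (assumed to be the default)
airflow-mcp-server

# Step 3: point at a remote Airflow instance over HTTP, configured via
# environment variables (flag names below are illustrative assumptions)
export AIRFLOW_API_URL=https://your-airflow.example.com
export AIRFLOW_AUTH_TOKEN=YOUR_TOKEN
airflow-mcp-server --transport http --port 8000
```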

Configuration notes

The following environment variables configure the Airflow connection: AIRFLOW_API_URL, AIRFLOW_USERNAME, AIRFLOW_PASSWORD, and AIRFLOW_AUTH_TOKEN. They determine which Airflow instance the MCP server targets and how it authenticates.
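As a sketch of how these variables typically translate into Airflow REST API request headers (the precedence shown, token over basic auth, is an assumption for illustration, not documented behavior):

```python
import base64
import os

def airflow_auth_headers(env=os.environ):
    """Build Authorization headers from the AIRFLOW_* variables.
    Prefers a bearer token when both a token and basic-auth
    credentials are set (an assumption in this sketch)."""
    token = env.get("AIRFLOW_AUTH_TOKEN")
    if token:
        return {"Authorization": f"Bearer {token}"}
    user = env.get("AIRFLOW_USERNAME")
    password = env.get("AIRFLOW_PASSWORD")
    if user and password:
        creds = base64.b64encode(f"{user}:{password}".encode()).decode()
        return {"Authorization": f"Basic {creds}"}
    return {}
```

For example, `airflow_auth_headers({"AIRFLOW_AUTH_TOKEN": "abc"})` yields a `Bearer abc` header, while username/password alone yields a `Basic` header.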

Two common deployment patterns exist: a standalone HTTP server and an Airflow plugin. In standalone mode, you run the MCP server as an independent ASGI application. In plugin mode, you install the MCP server into Airflow 3.x so the MCP endpoint is served from the Airflow webserver.

Security and authentication

The server supports Bearer tokens for Airflow 2.x and 3.x, OAuth2 token exchange for Airflow 3.x, and basic auth for Airflow 2.x. Choose the authentication method that aligns with your Airflow deployment. When running in HTTP mode, ensure TLS termination and secure storage of credentials.

Troubleshooting tips

If you cannot reach the MCP endpoint, verify that the Airflow API URL is reachable and that credentials are valid. Check that the MCP service is listening on the expected port and that any firewalls permit traffic from your MCP clients.
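A quick way to verify that the Airflow API URL is reachable is to hit Airflow's health endpoint directly. A minimal sketch, assuming the Airflow 2.x stable REST path `/api/v1/health` (Airflow 3.x serves its API under a different prefix, so adjust accordingly):

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen

def health_url(api_url: str) -> str:
    """Build the health-check URL from AIRFLOW_API_URL.
    Assumes the Airflow 2.x REST path; adjust for your version."""
    return urljoin(api_url.rstrip("/") + "/", "api/v1/health")

def check_airflow_health(api_url: str) -> int:
    """Perform the actual request; only call this against a
    reachable Airflow instance. Returns the HTTP status code."""
    with urlopen(Request(health_url(api_url))) as resp:
        return resp.status
```

Calling `check_airflow_health("https://your-airflow.example.com")` should return 200 when the API is up and reachable.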

Available tools

list_dags

Retrieve all DAGs with their metadata and basic details.

get_dag_details

Fetch detailed information for a specific DAG.

get_dag_source

Return the source code for a DAG.

get_dag_stats

Get DAG run statistics (Airflow 3.x only).

trigger_dag

Start a new DAG run (execute a workflow).

pause_dag

Pause a DAG to stop scheduled runs.

unpause_dag

Resume a paused DAG to continue scheduled runs.

list_tasks

List all tasks within a DAG.

get_task

Get details about a specific task.

get_task_logs

Retrieve logs for a task instance run.

list_pools

List all resource pools.

get_pool

Get details about a specific pool.

list_variables

List all Airflow variables.

get_variable

Get a specific variable by key.

list_connections

List all connections (credentials excluded for security).

list_assets

List assets/datasets with unified naming across Airflow versions.

list_plugins

List installed Airflow plugins.

list_providers

List installed provider packages.

get_airflow_config

Fetch Airflow configuration values.

get_airflow_version

Fetch Airflow version information.
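Most of these tools map onto endpoints of Airflow's REST API. As an illustration of what a tool like list_dags does under the hood (the endpoint path and the mapping are assumptions based on the Airflow 2.x stable REST API, not the server's source):

```python
from urllib.parse import urlencode

def list_dags_url(api_url: str, limit: int = 100, offset: int = 0) -> str:
    """URL for the DAG-listing endpoint that list_dags presumably
    wraps (Airflow 2.x stable REST API path; an assumption here)."""
    query = urlencode({"limit": limit, "offset": offset})
    return f"{api_url.rstrip('/')}/api/v1/dags?{query}"
```

A GET request to this URL with the authentication headers described above returns a paginated JSON list of DAGs, which is roughly the payload list_dags would surface to the assistant.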