
MCP-Airflow-API MCP Server

๐Ÿ”Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "call518-mcp-airflow-api": {
      "url": "http://localhost:8000/mcp",
      "headers": {
        "REMOTE_AUTH_ENABLE": "true",
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://localhost:8080/api",
        "AIRFLOW_API_PASSWORD": "airflow",
        "AIRFLOW_API_USERNAME": "airflow"
      }
    }
  }
}
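
The configuration above targets a streamable-http deployment. To launch the server locally over stdio instead (the mode used by python -m mcp_airflow_api in the install steps below), a typical client entry looks like the following; this is a minimal sketch, and the exact fields (command, args, env) depend on your MCP client:

{
  "mcpServers": {
    "mcp-airflow-api": {
      "command": "python",
      "args": ["-m", "mcp_airflow_api"],
      "env": {
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://localhost:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}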

MCP-Airflow-API lets you manage Apache Airflow clusters through natural language by using the Model Context Protocol. You interact with a single MCP server that loads the correct Airflow API toolset and translates your natural language requests into Airflow REST API actions, enabling intuitive workflow management without writing REST queries.

How to use

You connect an MCP client to the MCP-Airflow-API server and start issuing natural language requests. For example, you can ask to list running DAGs, trigger a DAG, or inspect task instances. The server chooses the appropriate API version based on configuration and runs the corresponding toolset. Use a compatible MCP client configuration to point to either a local or remote MCP server. When authentication is enabled for remote access, include the Bearer token in your client requests.
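
When REMOTE_AUTH_ENABLE is set for a remote deployment, the client must send the token with each request. A minimal sketch, assuming the token travels in a standard Authorization header (the exact header name and token format depend on your deployment):

{
  "mcpServers": {
    "call518-mcp-airflow-api": {
      "url": "http://localhost:8000/mcp",
      "headers": {
        "Authorization": "Bearer <your-secret-token>"
      }
    }
  }
}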

How to install

Prerequisites: you need Python for the server and a client capable of MCP communication. You may also want Docker and Docker Compose for the complete demo environment.

Step-by-step setup using a local development workflow and a ready-made demo environment:

git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API
# Optional: install dependencies and run in development mode
pip install -e .
# Run in stdio mode
python -m mcp_airflow_api

# For a Docker-based quickstart, you can use the companion demo environment
# See the Quickstart section for details on Docker Compose setup.

Additional sections

Security, configuration, and advanced usage details are provided to help you run MCP-Airflow-API safely in development and production. You can enable Bearer token authentication for remote access, switch API versions via environment variables, and configure multiple Airflow clusters if needed. The server exposes a web UI and an API endpoint, and you can access documentation and status endpoints once the server is running.

Configuration and security notes

The MCP server supports two transport modes: stdio for local usage and streamable-http for Docker or remote deployment. You control the mode with environment variables and can enable Bearer token authentication for remote access. Always enable authentication in production when using remote access.

Common environment variables you will use include API version, base URL, and credentials for Airflow. You can also set the MCP port and logging level. For security, use strong secret keys and enable HTTPS when possible behind a reverse proxy.
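
As a sketch, an environment for streamable-http mode might look like the following. The AIRFLOW_API_* and REMOTE_AUTH_ENABLE names appear in the client configuration above; the port and log-level variable names here are placeholders, so check the project documentation for the exact names:

# Airflow connection (names taken from the configuration example above)
export AIRFLOW_API_VERSION=v2
export AIRFLOW_API_BASE_URL=http://localhost:8080/api
export AIRFLOW_API_USERNAME=airflow
export AIRFLOW_API_PASSWORD=airflow

# Remote access: always enable Bearer token authentication in production
export REMOTE_AUTH_ENABLE=true

# Hypothetical names -- consult the project docs for the actual variables
export MCP_PORT=8000
export LOG_LEVEL=INFO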

Troubleshooting

If you cannot connect, verify that the MCP server is running, the configured port is correct, and the network allows access. Check logs for authentication errors if you are using streamable-http with Bearer tokens. Ensure the Airflow API base URL and credentials are reachable from the MCP server.
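
A quick connectivity check is to POST a JSON-RPC request to the MCP endpoint. A sketch, assuming the streamable-http transport on the default port and that the server implements the standard MCP ping request (a 401 response suggests a missing or invalid Bearer token):

curl -s -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Authorization: Bearer <your-secret-token>" \
  -d '{"jsonrpc":"2.0","id":1,"method":"ping"}'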

Examples and common use cases

Use cases include asking for current DAGs, monitoring cluster health, inspecting task durations, and querying configuration settings. You can tailor your queries to filter by DAG IDs, statuses, or time ranges, and use pagination to handle large environments.
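
Under the hood, each natural language request resolves to an MCP tool call. As a sketch, a tools/call request for list_dags might look like this; the argument names (limit, offset, dag_id_pattern) are illustrative, not confirmed parameters of the tool:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_dags",
    "arguments": {
      "limit": 20,
      "offset": 0,
      "dag_id_pattern": "etl_*"
    }
  }
}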

Available tools

list_dags

List DAGs with optional filters and pagination to view current workload.

get_dags_detailed_batch

Get detailed information for multiple DAGs, including latest run data.

running_dags

Show DAGs that are currently running.

failed_dags

Show DAGs with failed runs.

trigger_dag

Trigger a specific DAG run.

pause_dag

Pause a DAG to stop scheduling.

unpause_dag

Unpause a DAG to resume scheduling.

get_health

Check the health status of the Airflow cluster.

get_version

Retrieve Airflow version information.

list_pools

List Airflow pools and their slot usage.

get_pool

Get details for a specific pool.

list_variables

List Airflow variables and their values.

get_variable

Get the value of a specific Airflow variable.

list_task_instances_all

List task instances for a given DAG, with optional filters.

list_xcom_entries

List XCom entries for a task or DAG.

get_xcom_entry

Get a specific XCom value by key.

get_config

Show Airflow configuration settings.

list_config_sections

List all configuration sections.

get_config_section

Get settings for a specific configuration section.

search_config_options

Search for configuration options by keyword.

dag_graph

Show the task graph for a DAG.

dag_code

Get the source code of a DAG.

list_event_logs

List event logs for DAGs and tasks.

get_event_log

Get a specific event log by ID.

all_dag_event_summary

Show a summary of event counts across DAGs.

list_import_errors

List import errors with IDs.

get_import_error

Get a specific import error by ID.

all_dag_import_summary

Show a summary of import errors across DAGs.

dag_run_duration

Get run duration statistics for a DAG.

dag_task_duration

Show durations for the latest DAG run.

dag_calendar

Get calendar information and schedule for DAGs.

list_assets

Show all assets registered for data-aware scheduling (API v2).

list_asset_events

Show asset-related events.

get_your_custom_analysis

Your custom data analysis tool for domain-specific insights.