MCP-Airflow-API MCP Server
Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.
Configuration
{
"mcpServers": {
"call518-mcp-airflow-api": {
"url": "http://localhost:8000/mcp",
"headers": {
"REMOTE_AUTH_ENABLE": "true",
"AIRFLOW_API_VERSION": "v2",
"AIRFLOW_API_BASE_URL": "http://localhost:8080/api",
"AIRFLOW_API_PASSWORD": "airflow",
"AIRFLOW_API_USERNAME": "airflow"
}
}
}
}

MCP-Airflow-API lets you manage Apache Airflow clusters through natural language by using the Model Context Protocol. You interact with a single MCP server that loads the correct Airflow API toolset and translates your natural language requests into Airflow REST API actions, enabling intuitive workflow management without writing REST queries.
You connect an MCP client to the MCP-Airflow-API server and start issuing natural language requests. For example, you can ask to list running DAGs, trigger a DAG, or inspect task instances. The server chooses the appropriate API version based on configuration and runs the corresponding toolset. Use a compatible MCP client configuration to point to either a local or remote MCP server. When authentication is enabled for remote access, include the Bearer token in your client requests.
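When the server runs remotely with authentication enabled, the Bearer token travels in a standard HTTP Authorization header. The sketch below is a variant of the configuration above; the URL and token value are placeholders, and your deployment may require additional headers such as the AIRFLOW_* entries shown earlier:

{
  "mcpServers": {
    "call518-mcp-airflow-api": {
      "url": "https://mcp.example.com/mcp",
      "headers": {
        "Authorization": "Bearer <your-secret-token>"
      }
    }
  }
}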
Prerequisites: you need Python for the server and a client capable of MCP communication. You may also want Docker and Docker Compose for the complete demo environment.
Step-by-step setup using a local development workflow and a ready-made demo environment:
git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API
# Optional: install dependencies and run in development mode
pip install -e .
# Run in stdio mode
python -m mcp_airflow_api
# For a Docker-based quickstart, you can use the companion demo environment
# See the Quickstart section for details on Docker Compose setup.

Security, configuration, and advanced usage details are provided to help you run MCP-Airflow-API safely in development and production. You can enable Bearer token authentication for remote access, switch API versions via environment variables, and configure multiple Airflow clusters if needed. The server exposes a web UI and an API endpoint, and you can access documentation and status endpoints once the server is running.
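For the Docker-based quickstart referenced in the comments above, the flow is typically a clone followed by a compose launch. This is a sketch that assumes the repository ships a docker-compose.yml for the demo environment:

# Start the demo stack (assumes a docker-compose.yml in the repository root)
git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API
docker compose up -d

# Tear the stack down when finished
docker compose down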
The MCP server supports two transport modes: stdio for local usage and streamable-http for Docker or remote deployment. You control the mode with environment variables and can enable Bearer token authentication for remote access. Always enable authentication in production when using remote access.
Common environment variables include the Airflow API version, base URL, and credentials. You can also set the MCP port and logging level. For security, use strong secret keys and, where possible, terminate HTTPS behind a reverse proxy.
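As a concrete sketch: the Airflow-related variable names below are the ones shown in the configuration example above, while the transport, port, and log-level names are hypothetical placeholders, so verify the exact names against the project documentation:

# Airflow connection settings (names taken from the configuration example above)
export AIRFLOW_API_VERSION="v2"
export AIRFLOW_API_BASE_URL="http://localhost:8080/api"
export AIRFLOW_API_USERNAME="airflow"
export AIRFLOW_API_PASSWORD="airflow"

# Remote-access authentication toggle (shown in the headers example above)
export REMOTE_AUTH_ENABLE="true"

# Hypothetical placeholder names; stdio is the local default transport
export FASTMCP_TYPE="streamable-http"
export MCP_PORT="8000"
export LOG_LEVEL="INFO"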
If you cannot connect, verify that the MCP server is running, the configured port is correct, and the network allows access. Check logs for authentication errors if you are using streamable-http with Bearer tokens. Ensure the Airflow API base URL and credentials are reachable from the MCP server.
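A quick way to narrow down connection problems is to probe each hop separately. The commands below are a sketch: the MCP URL matches the configuration above, the token is a placeholder, and the Airflow health path may differ by version (Airflow 2 serves /api/v1/health):

# 1. Is the MCP server listening? An MCP endpoint typically rejects plain GETs,
#    but any HTTP response proves the port is reachable.
curl -i http://localhost:8000/mcp

# 2. Can this host reach the Airflow API directly?
curl -u airflow:airflow http://localhost:8080/api/v1/health

# 3. If Bearer auth is enabled, confirm the token is accepted.
curl -i -H "Authorization: Bearer <your-secret-token>" http://localhost:8000/mcp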
Use cases include asking for current DAGs, monitoring cluster health, inspecting task durations, and querying configuration settings. You can tailor your queries to filter by DAG ID, status, or time range, and use pagination to handle large environments. The natural-language requests map onto the tools below (a programmatic client example follows the list).
List DAGs with optional filters and pagination to view current workload.
Get detailed information for multiple DAGs, including latest run data.
Show DAGs that are currently running.
Show DAGs with failed runs.
Trigger a specific DAG run.
Pause a DAG to stop scheduling.
Unpause a DAG to resume scheduling.
Check the health status of the Airflow cluster.
Retrieve Airflow version information.
List worker pools and their usage.
Get details for a specific pool.
List Airflow variables and their values.
Get the value of a specific Airflow variable.
List task instances for a given DAG, with optional filters.
List XCom entries for a task or DAG.
Get a specific XCom value by key.
Show Airflow configuration settings.
List all configuration sections.
Get settings for a specific configuration section.
Search for configuration options by keyword.
Show the task graph for a DAG.
Get the source code of a DAG.
List event logs for DAGs and tasks.
Get a specific event log by ID.
Show a summary of event counts across DAGs.
List import errors with IDs.
Get a specific import error by ID.
Show a summary of import errors across DAGs.
Get run duration statistics for a DAG.
Show durations for the latest DAG run.
Get calendar information and schedule for DAGs.
Show all assets registered for data-aware scheduling (API v2).
Show asset-related events.
Your custom data analysis tool for domain-specific insights.
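To exercise these tools programmatically rather than through a chat client, you can use the official MCP Python SDK's streamable-http client. The sketch below targets the server from the configuration above; the tool name "list_dags" and its arguments are illustrative guesses, so list the tools first to discover the real names:

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

MCP_URL = "http://localhost:8000/mcp"  # matches the configuration example above
HEADERS = {"Authorization": "Bearer <your-secret-token>"}  # only needed if auth is enabled

async def main() -> None:
    # Open a streamable-http transport to the MCP server
    async with streamablehttp_client(MCP_URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the actual tool names exposed by this server
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical call: tool name and arguments are illustrative
            result = await session.call_tool("list_dags", arguments={"limit": 20})
            print(result.content)

asyncio.run(main())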