This skill guides you through Apache Airflow DAG generation, producing production-ready configurations and validating results for reliable data pipelines.
```shell
npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill airflow-dag-generator
```
Copy the command above to add this skill to your agents.
---
name: "airflow-dag-generator"
description: |
  Generate production-ready Apache Airflow DAGs. Auto-activating skill for the Data Pipelines category.
  Use when working with Airflow DAG generation, validation, or orchestration patterns. Trigger with phrases like "airflow dag generator", "airflow generator", "airflow".
allowed-tools: "Read, Write, Edit, Bash(cmd:*), Grep"
version: 1.0.0
license: MIT
author: "Jeremy Longshore <[email protected]>"
---
# Airflow DAG Generator
## Overview
This skill provides automated assistance for generating and validating Apache Airflow DAGs within the Data Pipelines domain.
## When to Use
This skill activates automatically when you:
- Mention "airflow dag generator" in your request
- Ask about Airflow DAG patterns or best practices
- Need help with data pipeline tasks covering ETL, data transformation, workflow orchestration, or streaming data processing
## Instructions
1. Provide step-by-step guidance for designing the Airflow DAG
2. Follow industry best practices and established DAG patterns
3. Generate production-ready code and configurations
4. Validate outputs against common standards (style, syntax, scheduling)
## Examples
**Example: Basic Usage**
Request: "Help me with airflow dag generator"
Result: Provides step-by-step guidance and generates appropriate configurations
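A DAG generated for a request like this might look like the following sketch. It is illustrative only: the pipeline name, task commands, and schedule are placeholder assumptions, and it targets Airflow 2.x (where the `schedule` parameter replaced `schedule_interval`):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative defaults; tune retries and alerting for your environment.
default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
    tags=["etl", "generated"],
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    transform = BashOperator(task_id="transform", bash_command="echo transform")
    load = BashOperator(task_id="load", bash_command="echo load")

    # Linear dependency chain: extract, then transform, then load.
    extract >> transform >> load
```

Setting `catchup=False` keeps Airflow from backfilling every missed interval since `start_date`, which is usually the safer default for newly deployed pipelines.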
## Prerequisites
- Relevant development environment configured
- Access to necessary tools and services
- Basic understanding of data pipelines concepts
## Output
- Generated configurations and code
- Best practice recommendations
- Validation results
## Error Handling
| Error | Cause | Solution |
|-------|-------|----------|
| Configuration invalid | Missing required fields | Check documentation for required parameters |
| Tool not found | Dependency not installed | Install required tools per prerequisites |
| Permission denied | Insufficient access | Verify credentials and permissions |
## Resources
- Official documentation for related tools
- Best practices guides
- Community examples and tutorials
## Related Skills
Part of the **Data Pipelines** skill category.
Tags: etl, airflow, spark, streaming, data-engineering
This skill accelerates creation and validation of Apache Airflow DAGs for data pipelines. It generates production-ready DAG code, configurations, and step-by-step guidance tailored to ETL, transformation, and streaming workflows. It activates when you mention airflow dag generator or similar triggers to streamline orchestration work.
The skill inspects your pipeline requirements (tasks, schedules, dependencies, operators, and resource constraints) and outputs Python DAG code, Docker/Kubernetes snippets, and configuration files. It applies common patterns and best practices, runs basic validation checks against style and syntactic issues, and suggests runtime and security configurations. You can iterate on generated DAGs with incremental prompts to refine operators, retries, and monitoring hooks.
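The structural part of those validation checks can be illustrated with a small standard-library sketch; the task graph and helper name here are hypothetical, not the skill's actual implementation. A DAG must be acyclic, so a basic validator walks the dependency mapping looking for back edges:

```python
# Detect cycles in a task-dependency mapping (task -> list of downstream
# tasks), the kind of structural check a DAG validator performs before
# accepting generated pipeline code.
def find_cycle(deps):
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / in progress / finished
    color = {task: WHITE for task in deps}

    def visit(task):
        color[task] = GRAY
        for nxt in deps.get(task, []):
            if color.get(nxt, WHITE) == GRAY:
                return True  # back edge: a cycle exists
            if color.get(nxt, WHITE) == WHITE and visit(nxt):
                return True
        color[task] = BLACK
        return False

    return any(visit(t) for t in deps if color[t] == WHITE)

# A linear extract -> transform -> load pipeline is valid;
# adding load -> extract introduces a cycle.
acyclic = {"extract": ["transform"], "transform": ["load"], "load": []}
cyclic = {"extract": ["transform"], "transform": ["load"], "load": ["extract"]}
```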
**What inputs do I need to provide?**
Provide task descriptions, data sources/destinations, schedule, preferred operators (e.g., BashOperator, SparkSubmitOperator), and resource constraints. Minimal prompts can produce a scaffold to refine.
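For example, naming `SparkSubmitOperator` in the prompt might yield a task like this sketch. The application path, connection id, and memory setting are placeholders, and the example assumes the `apache-airflow-providers-apache-spark` package is installed:

```python
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

transform = SparkSubmitOperator(
    task_id="spark_transform",
    application="/opt/jobs/transform.py",  # placeholder job path
    conn_id="spark_default",               # placeholder Airflow connection
    conf={"spark.executor.memory": "4g"},  # illustrative resource constraint
)
```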
**Can it generate DAGs for KubernetesExecutor or Celery?**
Yes. Specify the executor and runtime environment and the skill will tailor operator configurations, pod templates, and resource requests accordingly.
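As one sketch of what a Kubernetes-targeted task might look like, the snippet below uses `KubernetesPodOperator` with explicit resource requests. The image, namespace, and resource values are placeholder assumptions, and the import path reflects recent versions of the `apache-airflow-providers-cncf-kubernetes` package:

```python
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

ingest = KubernetesPodOperator(
    task_id="ingest",
    name="ingest-pod",              # placeholder pod name
    image="myrepo/ingest:latest",   # placeholder container image
    namespace="data-pipelines",     # placeholder namespace
    container_resources=k8s.V1ResourceRequirements(
        requests={"cpu": "500m", "memory": "1Gi"},
        limits={"cpu": "1", "memory": "2Gi"},
    ),
    get_logs=True,  # stream pod logs into the Airflow task log
)
```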