
This skill guides you through creating custom Airflow operators, with step-by-step guidance and production-ready configurations.

npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill airflow-operator-creator

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
2.2 KB
---
name: "airflow-operator-creator"
description: |
  Create and adapt custom Airflow operators. Auto-activating skill for Data Pipelines.
  Triggers on: "airflow operator creator", "airflow creator", "airflow"
  Part of the Data Pipelines skill category. Use when working with custom Airflow operator development.
allowed-tools: "Read, Write, Edit, Bash(cmd:*), Grep"
version: 1.0.0
license: MIT
author: "Jeremy Longshore <[email protected]>"
---

# Airflow Operator Creator

## Overview

This skill provides automated assistance for creating and adapting custom Airflow operators within the Data Pipelines domain.

## When to Use

This skill activates automatically when you:
- Mention "airflow operator creator" in your request
- Ask about Airflow operator patterns or best practices
- Need help with data pipeline tasks such as ETL, data transformation, workflow orchestration, or streaming data processing

## Instructions

1. Provides step-by-step guidance for designing and implementing Airflow operators
2. Follows established Airflow patterns and best practices
3. Generates production-ready operator code and configurations
4. Validates outputs against common standards

## Examples

**Example: Basic Usage**
Request: "Help me create a custom Airflow operator"
Result: Step-by-step guidance plus generated operator code and configurations


## Prerequisites

- Relevant development environment configured
- Access to necessary tools and services
- Basic understanding of data pipelines concepts


## Output

- Generated configurations and code
- Best practice recommendations
- Validation results


## Error Handling

| Error | Cause | Solution |
|-------|-------|----------|
| Configuration invalid | Missing required fields | Check documentation for required parameters |
| Tool not found | Dependency not installed | Install required tools per prerequisites |
| Permission denied | Insufficient access | Verify credentials and permissions |


## Resources

- Official documentation for related tools
- Best practices guides
- Community examples and tutorials

## Related Skills

Part of the **Data Pipelines** skill category.
Tags: etl, airflow, spark, streaming, data-engineering

Overview

This skill automates creation and guidance for Airflow operator development inside data pipelines. It produces step-by-step instructions, production-ready operator code, and configuration fragments tailored to your environment. The skill is auto-activating for queries that mention "airflow operator creator" and focuses on practical outcomes for ETL, transformation, and orchestration tasks.

How this skill works

When triggered, the skill inspects the requested operator use case, target Airflow version, and runtime context (e.g., Kubernetes, local executor). It generates operator scaffolding, example DAG snippets, and configuration checks that follow common Airflow patterns and safety constraints. The output includes validation notes and actionable remediation steps for common errors.
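As a concrete illustration, the operator scaffolding described above might look like the sketch below. The operator name, fields, and polling logic are illustrative, not the skill's fixed output, and the `try`/`except` stub only lets the sketch run where Airflow is not installed:

```python
# Illustrative operator scaffold. The stub in the except branch stands in
# for Airflow's BaseOperator so the sketch is self-contained.
try:
    from airflow.models import BaseOperator
except ImportError:  # minimal stand-in when Airflow is unavailable
    class BaseOperator:
        def __init__(self, task_id, **kwargs):
            self.task_id = task_id


class HttpPollOperator(BaseOperator):
    """Hypothetical operator: submits a job and polls until it completes."""

    template_fields = ("endpoint",)  # let the endpoint be Jinja-templated

    def __init__(self, endpoint, poll_interval=30, **kwargs):
        super().__init__(**kwargs)
        self.endpoint = endpoint
        self.poll_interval = poll_interval

    def execute(self, context):
        # A real implementation would call a hook here and loop with
        # self.poll_interval; this sketch just returns the endpoint so
        # the control flow is visible.
        return f"polled {self.endpoint}"
```

A generated scaffold would typically add a hook for the external system, logging, and retry-aware polling inside `execute`.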

When to use it

  • You need a custom Airflow operator or want to adapt an existing one to your environment.
  • You want production-ready operator code and DAG examples that follow best practices.
  • You need quick troubleshooting for operator configuration, permissions, or dependency issues.
  • You are designing operators for ETL, streaming, or cross-system orchestration.
  • You want validation and remediation advice for Airflow configuration errors.

Best practices

  • Target a specific Airflow version and executor to ensure compatibility.
  • Keep operators focused and idempotent; push heavy work to tasks, not DAG logic.
  • Use connection hooks for credentials and XCom only for small payloads; never embed secrets in code or pass them through XCom.
  • Include unit tests and integration smoke tests for operators and DAGs before deployment.
  • Validate permissions and runtime dependencies (Python packages, system tools) as part of CI.

Example use cases

  • Generate a PythonOperator scaffold that wraps a packaged library function with retry and logging.
  • Create a custom operator that triggers and polls an external API until completion.
  • Produce DAG snippets that deploy tasks to KubernetesPodOperator with resource requests and tolerations.
  • Diagnose a permission denied error when an operator tries to access cloud storage and propose fixes.
  • Convert a legacy BashOperator workflow into modular PythonOperators with clear error handling.
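The first use case above (wrapping a packaged library function with retry and logging) might be sketched as the DAG fragment below. The `dag_id`, schedule, and wrapped function are placeholders, and the `schedule` argument requires Airflow 2.4+ (older versions use `schedule_interval`):

```python
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def run_job(**context):
    # Placeholder for a call into your packaged library.
    log.info("running job for logical date %s", context["ds"])


with DAG(
    dag_id="example_wrapped_job",  # placeholder dag_id
    start_date=datetime(2024, 1, 1),
    schedule=None,  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_job",
        python_callable=run_job,
        retries=3,  # retry the task up to 3 times
        retry_delay=timedelta(minutes=5),
    )
```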

FAQ

Which Airflow versions does this target?

Specify your Airflow version when requesting code; the skill adapts patterns for common versions and executors.

Can it generate KubernetesPodOperator configurations?

Yes. Provide your cluster constraints and image details and it will produce pod specs, resource limits, and RBAC hints.
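For reference, a minimal KubernetesPodOperator task with resource requests and a toleration might look like the fragment below. The import path varies by provider version, and the namespace, image, and toleration values are placeholders:

```python
from kubernetes.client import models as k8s

# Import path for recent versions of the cncf.kubernetes provider; older
# releases use airflow.providers.cncf.kubernetes.operators.kubernetes_pod.
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

transform = KubernetesPodOperator(
    task_id="transform",
    name="transform-pod",
    namespace="pipelines",             # placeholder namespace
    image="example.registry/etl:1.0",  # placeholder image
    cmds=["python", "-m", "etl.transform"],
    container_resources=k8s.V1ResourceRequirements(
        requests={"cpu": "500m", "memory": "1Gi"},
        limits={"cpu": "1", "memory": "2Gi"},
    ),
    tolerations=[
        k8s.V1Toleration(
            key="dedicated", operator="Equal",
            value="pipelines", effect="NoSchedule",
        )
    ],
    get_logs=True,  # stream pod logs into the Airflow task log
)
```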