
Apache Airflow Engineer

Expert in Apache Airflow — DAG design, task orchestration, monitoring, scaling, ELT pipeline automation, and production operations with best practices for reliability.

Gold
v1.0.0 · 0 activations · Data & Analytics · Technology · Advanced

SupaScore: 84.9

  • Research Quality (15%): 8.3
  • Prompt Engineering (25%): 8.5
  • Practical Utility (15%): 8.8
  • Completeness (10%): 8.2
  • User Satisfaction (20%): 8.6
  • Decision Usefulness (15%): 8.4

Best for

  • Design and implement DAG architectures for ELT data pipelines orchestrating dbt, Spark, and cloud data warehouse transformations
  • Troubleshoot failed Airflow task dependencies, retry logic, and cross-DAG scheduling issues in production environments
  • Optimize Airflow deployment on Kubernetes with KubernetesPodOperator for isolated task execution and horizontal scaling (see the sketch after this list)
  • Implement monitoring, alerting, and SLA management for data pipeline orchestration with proper observability patterns
  • Migrate legacy cron jobs and ETL workflows to Airflow with proper idempotency and recovery mechanisms
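
A minimal sketch of the retry, SLA, and isolated-execution patterns referenced in this list, assuming Airflow 2.4+ with a recent cncf.kubernetes provider installed; the DAG id, namespace, image, and schedule are illustrative placeholders, not part of the skill itself.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

    # Shared retry and SLA behaviour applied to every task in the DAG.
    default_args = {
        "retries": 3,
        "retry_delay": timedelta(minutes=5),
        "retry_exponential_backoff": True,
        "sla": timedelta(hours=1),
    }

    with DAG(
        dag_id="elt_sales_pipeline",          # placeholder name
        start_date=datetime(2026, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Run the heavy transformation in an isolated pod so scheduler and workers stay lean.
        transform_sales = KubernetesPodOperator(
            task_id="transform_sales",
            name="transform-sales",
            namespace="default",                               # placeholder namespace
            image="registry.example.com/spark-job:latest",     # hypothetical image
            get_logs=True,
        )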

What you'll get

  • Complete DAG Python file with TaskFlow API decorators, proper task dependencies, templated variables, and error handling configuration (a TaskFlow-style sketch follows this list)
  • Airflow deployment architecture diagram with scheduler, executor, and database components plus Kubernetes scaling configuration
  • Monitoring dashboard configuration covering task success rates, SLA violations, and pipeline health metrics, plus alerting rules
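
A minimal TaskFlow-style sketch of the kind of DAG file this deliverable describes, assuming Airflow 2.x; the function names, record contents, and retry count are illustrative assumptions.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
    def example_elt():
        @task(retries=2)
        def extract() -> list[dict]:
            # Stand-in for pulling raw records from a source system.
            return [{"order_id": 1, "amount": 42.0}]

        @task
        def load(records: list[dict]) -> None:
            # Stand-in for a warehouse load; XCom carries `records` between tasks.
            print(f"loading {len(records)} records")

        load(extract())  # dependency chain: extract >> load


    example_elt()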

Not designed for

  • Heavy data processing or transformation execution (Airflow is an orchestrator, not a processing engine)
  • Real-time streaming data processing (Airflow is designed for batch workflows with scheduled intervals)
  • Simple single-step jobs that don't require orchestration or dependency management
  • Airflow infrastructure provisioning without understanding DAG design patterns and operational requirements

Expects

Specific data pipeline requirements including data sources, transformation steps, scheduling frequency, SLA requirements, and dependency relationships between tasks.

Returns

Complete DAG implementation with task definitions, dependency chains, error handling, monitoring configuration, and operational documentation following Airflow best practices.

Evidence Policy

Standard: no explicit evidence policy.

apache-airflow · dag · pipeline-orchestration · elt · data-engineering · workflow-automation

Research Foundation: 6 sources (4 official docs, 1 web, 1 book)

This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.

Version History

v1.0.0 · 2/15/2026

Initial version

Common Workflows

Modern Data Stack Pipeline Implementation

End-to-end data pipeline design: start with architecture planning, implement Airflow orchestration, configure dbt transformations, and establish monitoring.
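
One hedged way to wire the dbt step of this workflow is a BashOperator that shells out to the dbt CLI; the DAG id and the project and profiles paths below are assumptions for illustration, not prescribed locations.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="modern_data_stack",
        start_date=datetime(2026, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models, then run their tests; both paths are placeholders.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
        )
        dbt_run >> dbt_test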

Activate this skill in Claude Code

Sign up for free to access the full system prompt via REST API or MCP.


© 2026 Kill The Dragon GmbH. This skill and its system prompt are protected by copyright. Unauthorised redistribution is prohibited.