Data & Analytics · Technology · Platinum

Automate and manage complex data workflows.

Apache Airflow Engineer

Apache Airflow, DAGs, Kubernetes

advanced · v5.0

Best for

  • Design and implement DAG architectures for ELT data pipelines orchestrating dbt, Spark, and cloud data warehouse transformations
  • Troubleshoot failed Airflow task dependencies, retry logic, and cross-DAG scheduling issues in production environments
  • Optimize Airflow deployment on Kubernetes with KubernetesPodOperator for isolated task execution and horizontal scaling
  • Implement monitoring, alerting, and SLA management for data pipeline orchestration with proper observability patterns
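The retry and failure-handling concerns listed above can be illustrated with a small sketch. This is a plain-Python stand-in for the behaviour Airflow configures per task via `retries`, `retry_delay`, and `retry_exponential_backoff`; the helper name `run_with_retries` is hypothetical and not an Airflow API:

```python
import time

def run_with_retries(fn, retries=3, retry_delay=1.0, exponential=True, sleep=time.sleep):
    """Call fn(); on failure, retry up to `retries` times, waiting between
    attempts. With exponential=True the delay doubles each attempt, mirroring
    the spirit of Airflow's per-task retry settings (not an Airflow API)."""
    attempt = 0
    while True:
        try:
            return fn()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # retries exhausted; surface the failure (task would be marked failed)
            delay = retry_delay * (2 ** (attempt - 1)) if exponential else retry_delay
            sleep(delay)
```

The injectable `sleep` parameter is just a convenience for testing the backoff schedule without real waiting.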

What you'll get

  • Complete DAG Python file with TaskFlow API decorators, proper task dependencies, templated variables, and error handling configuration
  • Airflow deployment architecture diagram with scheduler, executor, and database components plus Kubernetes scaling configuration
  • Monitoring dashboard configuration with task success rates, SLA violations, and pipeline health metrics with alerting rules
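To make the "proper task dependencies" deliverable concrete: an Airflow scheduler can only run a DAG's tasks in an order consistent with their dependency edges, and it rejects cycles at parse time. Here is a minimal, illustrative topological-sort sketch of that resolution in plain Python (not Airflow's actual scheduler code); every task is assumed to appear as a key in `deps`:

```python
from collections import deque

def topological_order(deps):
    """deps maps task_id -> set of upstream task_ids it waits on.
    Returns one valid execution order, or raises ValueError on a cycle
    (as Airflow does when a DAG is not acyclic). Illustrative only."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for t, ups in deps.items():
        for u in ups:
            downstream[u].append(t)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for d in downstream[t]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order
```

For a classic extract → transform → load → report chain this yields the tasks in pipeline order, which is the same guarantee the `>>` dependency operator gives you in a real DAG file.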
Expects

Specific data pipeline requirements including data sources, transformation steps, scheduling frequency, SLA requirements, and dependency relationships between tasks.

Returns

Complete DAG implementation with task definitions, dependency chains, error handling, monitoring configuration, and operational documentation following Airflow best practices.
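The SLA monitoring mentioned above boils down to comparing each run's completion time against a deadline derived from its scheduled start. A simplified sketch of that check (a stand-in for Airflow's SLA-miss detection, with the hypothetical helper name `sla_misses`):

```python
from datetime import datetime, timedelta

def sla_misses(runs, sla):
    """runs: iterable of (task_id, scheduled_start, finished_at) tuples.
    Returns the task_ids that finished more than `sla` after their
    scheduled start; a simplified model of an SLA-miss check."""
    return [task for task, start, end in runs if end - start > sla]
```

In practice the flagged tasks would feed an alerting rule rather than just a list.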

What's inside

You are an Apache Airflow Engineer. You design, deploy, and operate production-grade data pipeline orchestration with Apache Airflow, combining deep technical knowledge of Airflow's architecture with practical expertise in DAG design patterns, executor configuration, monitoring, and operational reli...

Covers

What You Do Differently · Methodology · Watch For
Not designed for ↓
  • Heavy data processing or transformation execution (Airflow is an orchestrator, not a processing engine)
  • Real-time streaming data processing (Airflow is designed for batch workflows with scheduled intervals)
  • Simple single-step jobs that don't require orchestration or dependency management
  • Airflow infrastructure provisioning without understanding DAG design patterns and operational requirements

SupaScore

89.03

  • Research Quality (15%): 9.1
  • Prompt Engineering (25%): 8.95
  • Practical Utility (15%): 8.65
  • Completeness (10%): 9.3
  • User Satisfaction (20%): 8.8
  • Decision Usefulness (15%): 8.75

Evidence Policy

Standard: no explicit evidence policy.

apache-airflow · dag · pipeline-orchestration · elt · data-engineering · workflow-automation

Research Foundation: 6 sources (4 official docs, 1 web, 1 books)

This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.

Version History

v5.0 · 3/25/2026

v5.5 final distill

v2.0 · 2/19/2026

Pipeline v4: rebuilt with 3 helper skills

v1.0.0 · 2/15/2026

Initial version

Prerequisites

Use these skills first for best results.

Works well with

Need more depth?

Specialist skills that go deeper in areas this skill touches.

Common Workflows

Modern Data Stack Pipeline Implementation

End-to-end data pipeline design: plan the architecture, implement Airflow orchestration, configure dbt transformations, and establish monitoring

© 2026 Kill The Dragon GmbH. This skill and its system prompt are protected by copyright. Unauthorised redistribution is prohibited. Terms of Service · Legal Notice