Data & Analytics · Technology · Platinum

Designing reliable data pipelines for batch processing.

ETL Pipeline Designer

Apache Airflow, dbt, Snowflake

Advanced · v5.0

Best for

  • Designing incremental loading patterns with CDC and timestamp-based watermarks
  • Building idempotent Airflow DAGs with proper retry and failure handling
  • Implementing dbt data transformation workflows with incremental models
  • Creating data quality validation frameworks with Great Expectations
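The watermark-based incremental loading named in the first bullet can be sketched in plain Python. This is a minimal sketch, not the skill's actual output: the `updated_at` column and in-memory rows are illustrative stand-ins for a source table, and in a real pipeline the function would run inside an Airflow task with the watermark persisted between runs.

```python
from datetime import datetime

def extract_increment(source_rows, last_watermark):
    """Return only rows modified after the stored watermark,
    plus the new high-water mark to persist for the next run."""
    fresh = [r for r in source_rows if r["updated_at"] > last_watermark]
    # If nothing changed, carry the old watermark forward unchanged.
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2026, 3, 1)},
    {"id": 2, "updated_at": datetime(2026, 3, 20)},
]
fresh, wm = extract_increment(rows, datetime(2026, 3, 10))
# only the row newer than the watermark is extracted
```

Persisting `new_watermark` only after a successful load is what makes the pattern safe to retry: a failed run re-reads the same increment.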

What you'll get

  • Complete Airflow DAG code with task dependencies, error handling, and SLA monitoring for a multi-source data integration
  • dbt project structure with incremental models, tests, and documentation for dimensional modeling
  • Detailed architecture diagram showing data flow, transformation layers, and quality checkpoints with specific tool recommendations
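Idempotency, which the DAG deliverable above depends on, usually comes from overwrite-by-partition writes: a rerun for the same logical date replaces that date's slice instead of appending duplicates. A library-free sketch under that assumption (the dict stands in for a warehouse table; a real target would use `DELETE ... INSERT` or `MERGE` per partition):

```python
def load_partition(target, partition_key, rows):
    """Idempotent load: overwrite one partition atomically,
    so retries and backfills cannot create duplicate rows."""
    target[partition_key] = list(rows)  # replace, never append
    return target

warehouse = {}
load_partition(warehouse, "2026-03-25", [{"id": 1}, {"id": 2}])
load_partition(warehouse, "2026-03-25", [{"id": 1}, {"id": 2}])  # rerun
# the partition still holds exactly two rows
```

Because the write is keyed by logical date rather than wall-clock time, Airflow's retries and `catchup` backfills converge to the same final state.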
Expects

Clear specification of data sources, target systems, SLAs, data volume/velocity, and business transformation logic.

Returns

Detailed pipeline architecture with DAG design, incremental loading strategy, data quality checks, orchestration setup, and implementation code examples.
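The data quality checks mentioned above often reduce to assertion-style expectations evaluated before data is published downstream. A hedged, library-free sketch of that gate (in a real pipeline Great Expectations suites or dbt tests would replace this; the `id` column is illustrative):

```python
def run_checks(rows):
    """Evaluate simple expectations on a batch; return a list of
    human-readable failures. Empty list means the batch may publish."""
    failures = []
    ids = [r.get("id") for r in rows]
    if not rows:
        failures.append("table must not be empty")
    if any(i is None for i in ids):
        failures.append("id must not be null")
    if len(ids) != len(set(ids)):
        failures.append("id must be unique")
    return failures

ok = run_checks([{"id": 1}, {"id": 2}])
bad = run_checks([{"id": 1}, {"id": 1}, {"id": None}])
```

Wiring such a check as a task between load and publish is what turns quality rules into actual quality checkpoints in the DAG.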

What's inside

You are a Data Pipeline Architect. You design, build, and operate reliable, scalable data pipelines that deliver high-quality data on time.
  • Prioritize idempotency, observability, and recoverability over architectural complexity
  • Match architecture (ELT vs. ETL vs. Streaming vs. Lakehouse) to spec...

Covers

What You Do Differently · Methodology · Watch For
Not designed for ↓
  • Real-time streaming analytics or Kafka event processing
  • Machine learning model training pipelines or MLOps workflows
  • Ad-hoc data analysis or exploratory data science
  • Frontend data visualization or dashboard building

SupaScore

89.4
Research Quality (15%)
9.1
Prompt Engineering (25%)
8.95
Practical Utility (15%)
8.8
Completeness (10%)
8.9
User Satisfaction (20%)
9.0
Decision Usefulness (15%)
8.85

Evidence Policy

Standard: no explicit evidence policy.

etl · elt · data-pipeline · airflow · dbt · data-transformation · incremental-loading · cdc · data-lineage · idempotent · data-quality · orchestration

Research Foundation: 8 sources (5 official docs, 1 book, 1 industry framework, 1 community practice)

This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.

Version History

v5.0 · 3/25/2026

v5.5 final distill

v2.0 · 2/22/2026

Pipeline v4: rebuilt with 3 helper skills

v1.0.1 · 2/15/2026

Auto-versioned: masterfile quality gate passed (score: 85.5)

v1.0.0 · 2/15/2026

Initial release

Prerequisites

Use these skills first for best results.

Works well with

Need more depth?

Specialist skills that go deeper in areas this skill touches.

Common Workflows

Modern Data Stack Implementation

End-to-end data platform setup from warehouse design through transformation pipelines to quality monitoring

© 2026 Kill The Dragon GmbH. This skill and its system prompt are protected by copyright. Unauthorised redistribution is prohibited. Terms of Service · Legal Notice