Designing reliable data pipelines for batch processing.
ETL Pipeline Designer
Apache Airflow, dbt, Snowflake
Best for
- Designing incremental loading patterns with CDC and timestamp-based watermarks
- Building idempotent Airflow DAGs with proper retry and failure handling
- Implementing dbt data transformation workflows with incremental models
- Creating data quality validation frameworks with Great Expectations
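The timestamp-based watermark pattern mentioned above can be sketched in a few lines of plain Python. This is an illustrative example, not code from the skill itself: the function names and the idea of persisting the watermark in pipeline state (e.g. an Airflow Variable or a state table) are assumptions.

```python
def build_incremental_query(table: str, watermark_column: str,
                            last_watermark: str) -> str:
    """Build a SQL filter that pulls only rows changed since the last run.

    `last_watermark` is the highest updated-at value seen in the previous
    run, persisted in pipeline state between executions.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > '{last_watermark}' "
        f"ORDER BY {watermark_column}"
    )


def advance_watermark(rows: list[dict], watermark_column: str,
                      last_watermark: str) -> str:
    """Return the new watermark: the max timestamp seen in this batch,
    or the old watermark if the batch was empty (so empty reruns do
    not move the cursor)."""
    if not rows:
        return last_watermark
    return max(row[watermark_column] for row in rows)
```

Comparing with `>` rather than `>=` avoids re-reading the boundary row, at the cost of missing late rows that share the exact watermark timestamp; production designs often subtract a small lookback window to compensate.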
What you'll get
- Complete Airflow DAG code with task dependencies, error handling, and SLA monitoring for multi-source data integration
- dbt project structure with incremental models, tests, and documentation for dimensional modeling
- Detailed architecture diagram showing data flow, transformation layers, and quality checkpoints, with specific tool recommendations
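The quality checkpoints listed above follow the expectation/checkpoint shape popularized by Great Expectations. The sketch below hand-rolls that shape in plain Python to show the idea; it deliberately does not use the real Great Expectations API, and all names here are illustrative.

```python
def expect_values_not_null(rows: list[dict], column: str) -> dict:
    """Check that every row has a non-null value in `column`.

    Returns an expectation-style result dict (success flag plus a count
    of offending rows), loosely modeled on Great Expectations output.
    """
    bad = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"success": not bad, "unexpected_count": len(bad)}


def run_checkpoint(rows: list[dict], checks: list[tuple]) -> dict:
    """Run a list of (check_fn, column) pairs against a batch.

    The checkpoint succeeds only if every individual expectation
    succeeds, mirroring a validation gate between pipeline stages.
    """
    results = [fn(rows, col) for fn, col in checks]
    return {"success": all(r["success"] for r in results),
            "results": results}
```

Wiring a checkpoint like this between extract and load tasks lets a DAG fail fast on bad data instead of propagating it downstream.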
Clear requirements for data sources, target systems, SLA requirements, data volume/velocity, and business transformation logic.
Detailed pipeline architecture with DAG design, incremental loading strategy, data quality checks, orchestration setup, and implementation code examples.
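Idempotency, which the deliverables above emphasize, usually comes down to a keyed upsert: replaying the same batch after a retry must leave the target in the same state. A minimal sketch, using an in-memory dict as a stand-in for a warehouse MERGE (the function name and signature are illustrative):

```python
def idempotent_upsert(target: dict, batch: list[dict], key: str) -> dict:
    """Merge a batch into the target keyed by `key`.

    Rows with an existing key are overwritten rather than appended,
    so running the same batch twice (e.g. on task retry) produces
    the same target state: f(f(x)) == f(x).
    """
    merged = dict(target)  # leave the caller's copy untouched
    for row in batch:
        merged[row[key]] = row
    return merged
```

In Snowflake this corresponds to `MERGE INTO ... ON target.key = source.key`, or a delete-then-insert scoped to the batch's keys; plain `INSERT` appends duplicates on retry and is the classic non-idempotent failure mode.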
What's inside
“You are a Data Pipeline Architect. You design, build, and operate reliable, scalable data pipelines that deliver high-quality data on time. - Prioritize idempotency, observability, and recoverability over architectural complexity - Match architecture (ELT vs. ETL vs. Streaming vs. Lakehouse) to spec...”
Not designed for ↓
- Real-time streaming analytics or Kafka event processing
- Machine learning model training pipelines or MLOps workflows
- Ad-hoc data analysis or exploratory data science
- Frontend data visualization or dashboard building
SupaScore
89.4
Evidence Policy
Standard: no explicit evidence policy.
Research Foundation: 8 sources (5 official docs, 1 book, 1 industry framework, 1 community practice)
This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.
Version History
v5.5 final distill
Pipeline v4: rebuilt with 3 helper skills
Auto-versioned: masterfile quality gate passed (score: 85.5)
Initial release
Common Workflows
Modern Data Stack Implementation
End-to-end data platform setup from warehouse design through transformation pipelines to quality monitoring
© 2026 Kill The Dragon GmbH. This skill and its system prompt are protected by copyright. Unauthorised redistribution is prohibited.