
Streaming ETL Architect

Design and build production-grade real-time data pipelines using Apache Kafka, Flink, Spark Streaming, and CDC patterns with exactly-once semantics and operational observability.

Gold · v1.0.0 · 0 activations · Data & Analytics · Technology · Expert

SupaScore: 84.6

  • Research Quality (15%): 8.5
  • Prompt Engineering (25%): 8.5
  • Practical Utility (15%): 8.5
  • Completeness (10%): 8.5
  • User Satisfaction (20%): 8.3
  • Decision Usefulness (15%): 8.5

Best for

  • Building real-time fraud detection pipelines with exactly-once processing guarantees (a Flink sketch follows this list)
  • Implementing CDC from PostgreSQL/MySQL to data lake with sub-second latency
  • Designing multi-region Kafka clusters with cross-datacenter replication for financial data
  • Migrating batch ETL workflows to streaming with backfill reconciliation
  • Creating event-driven data mesh with schema evolution and backward compatibility
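
For the fraud-detection item above, here is a minimal sketch of an exactly-once Kafka-to-Kafka Flink job. It assumes Flink 1.17+ with the flink-connector-kafka dependency; the broker address, topic names, consumer group, and the scoring step are illustrative placeholders rather than part of this skill's prompt.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FraudDetectionJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Exactly-once needs checkpointing; the interval also bounds end-to-end latency,
        // because the transactional sink only commits when a checkpoint completes.
        env.enableCheckpointing(60_000, CheckpointingMode.EXACTLY_ONCE);

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")               // placeholder broker address
                .setTopics("payments.raw")                       // placeholder input topic
                .setGroupId("fraud-detector")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("payments.scored")             // placeholder output topic
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                // EXACTLY_ONCE uses Kafka transactions; the broker's transaction.max.timeout.ms
                // must be at least as large as the timeout configured here.
                .setDeliveryGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
                .setTransactionalIdPrefix("fraud-detector")
                .setProperty("transaction.timeout.ms", "900000")
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "payments-source")
                .map(new MapFunction<String, String>() {
                    @Override
                    public String map(String event) {
                        return event.toUpperCase();   // stand-in for real scoring logic
                    }
                })
                .sinkTo(sink);

        env.execute("fraud-detection-pipeline");
    }
}
```

For the guarantee to hold end to end, downstream consumers must also read with isolation.level=read_committed, since the sink writes inside Kafka transactions.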

What you'll get

  • Detailed architecture diagram showing Kafka topics, Flink jobs, state backends, and sink connectors with specific configuration parameters
  • Step-by-step implementation guide including Docker Compose setup, schema registry configuration, and exactly-once semantics tuning (a schema-registry producer sketch follows this list)
  • Comprehensive monitoring dashboard specification with SLIs/SLOs, alerting rules, and operational runbooks for common failure scenarios
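
To ground the schema-registry item above, here is a hedged sketch of a Kafka producer publishing Avro records through Confluent Schema Registry. It assumes the kafka-avro-serializer dependency is on the classpath; the registry URL, topic, and Payment schema are illustrative.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroEventProducer {
    // Illustrative Avro schema for a payment event.
    private static final String SCHEMA_JSON =
            "{\"type\":\"record\",\"name\":\"Payment\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");                     // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://schema-registry:8081");  // placeholder URL
        // Idempotent producer: retries cannot introduce duplicates.
        props.put("enable.idempotence", "true");

        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord payment = new GenericData.Record(schema);
        payment.put("id", "p-1001");
        payment.put("amount", 42.50);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // The Avro serializer registers/validates the schema against the registry,
            // so incompatible changes are rejected before they ever reach the topic.
            producer.send(new ProducerRecord<>("payments.raw", "p-1001", payment));
            producer.flush();
        }
    }
}
```

Pairing this with a BACKWARD compatibility setting on the registry subject is what keeps older consumers working as the schema evolves.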

Not designed for

  • Simple batch ETL jobs that run daily or weekly
  • Basic Kafka producer/consumer applications without complex processing
  • One-off data migrations or ad-hoc data analysis
  • Small-scale data pipelines under 1,000 events/second

Expects

Clear requirements for data sources, target sinks, latency SLAs, delivery guarantees, and scale (events/sec, data volume).

Returns

Complete streaming architecture design with technology selection, topology diagrams, configuration parameters, monitoring strategy, and operational runbooks.

Evidence Policy

Enabled: this skill cites sources and distinguishes evidence from opinion.

Tags: streaming · etl · kafka · flink · spark-streaming · cdc · debezium · real-time · event-driven · data-pipeline · exactly-once · stream-processing · schema-registry

Research Foundation: 8 sources (3 official docs, 3 books, 1 paper, 1 web)

This skill was developed through independent research and synthesis. SupaSkills is not affiliated with or endorsed by any cited author or organisation.

Version History

v1.0.0 · 2/16/2026

Initial release

Common Workflows

Real-time Analytics Platform Build

Complete pipeline from event design through streaming processing to real-time dashboards with full observability
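
One concrete observability signal in such a platform is consumer-group lag. Below is a minimal sketch that computes it with the Kafka AdminClient; the broker address and group id are placeholders, and in practice the value would be exported as a metric rather than printed.

```java
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLagProbe {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");    // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Offsets the consumer group has committed, per partition.
            Map<TopicPartition, OffsetAndMetadata> committed = admin
                    .listConsumerGroupOffsets("fraud-detector")   // placeholder group id
                    .partitionsToOffsetAndMetadata()
                    .get();

            // Latest (log-end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> query = committed.keySet().stream()
                    .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                    admin.listOffsets(query).all().get();

            // Total lag = sum over partitions of (log-end offset - committed offset).
            long totalLag = committed.entrySet().stream()
                    .mapToLong(e -> latest.get(e.getKey()).offset() - e.getValue().offset())
                    .sum();
            System.out.println("consumer lag (records): " + totalLag);
        }
    }
}
```

A lag SLI like this pairs naturally with burn-rate alerting against the pipeline's freshness SLO.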

Activate this skill in Claude Code

Sign up for free to access the full system prompt via REST API or MCP.


© 2026 Kill The Dragon GmbH. This skill and its system prompt are protected by copyright. Unauthorised redistribution is prohibited. Terms of Service · Legal Notice