Al Ain, Abu Dhabi, United Arab Emirates

Data Engineer

Job Description:

Role summary
Build robust, scalable data pipelines and platforms that deliver clean, reliable, and timely data for analytics and AI.

Key responsibilities

  • Design ELT or ETL pipelines and orchestrate them with Airflow or Prefect.
  • Model data for analytics, implement change data capture (CDC), and manage schemas.
  • Add data quality tests, lineage, and cataloging.
  • Tune warehouses for performance and cost.
  • Secure data with role-based access and compliance controls.
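The data quality responsibility above can be illustrated with a minimal, framework-free sketch in Python (the kind of check that tools like Great Expectations or dbt tests automate; the dataset and rule names here are invented for illustration):

```python
# Minimal data quality checks, analogous to what frameworks like
# Great Expectations or dbt tests automate. The dataset and rules
# are illustrative only.

def check_not_null(rows, column):
    """Return row indices where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

orders = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 2, "amount": 5.0},
]

print(check_not_null(orders, "amount"))  # -> [1]: row with a missing amount
print(check_unique(orders, "order_id"))  # -> [2]: duplicated key
```

In a production pipeline these assertions would run as declarative tests in the orchestrator or transformation layer rather than as ad hoc scripts.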

Minimum qualifications

  • 3+ years in data engineering with strong SQL and Python.
  • Experience with BigQuery or Snowflake or Redshift and an orchestrator.
  • Version control and CI for data pipelines.
  • Basic understanding of dbt or similar transformation frameworks.

Preferred

  • Streaming with Kafka or Pub/Sub.
  • Terraform and cloud data services.
  • Great Expectations or similar testing frameworks.

Tools and stack
Airflow or Prefect, dbt, Kafka or Pub/Sub, Python, SQL, Terraform, Great Expectations, BigQuery or Snowflake or Redshift, with Looker or Power BI as downstream BI tools.

KPIs / success metrics
Pipeline SLA and reliability, data quality scores, cost per terabyte, time to data availability.

Initial screening questions

  • How would you design CDC from a high-volume OLTP source into a warehouse?
  • How do you validate models and prevent schema drift?
  • Show how you balance cost and performance for a slowly changing dimension.
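As context for the first question: a CDC feed is typically a stream of keyed change events (insert, update, delete) merged into a warehouse table. A toy in-memory merge, with an event shape invented for illustration (real tools such as Debezium use richer envelopes), might look like:

```python
# Toy CDC merge: apply keyed change events to a target "table".
# The event shape (op, id, data) is invented for illustration;
# real CDC tooling carries richer metadata (timestamps, LSNs, etc.).

def apply_cdc(target, events, key="id"):
    """Merge insert/update/delete events into `target`, a dict keyed by `key`."""
    for event in events:
        k = event[key]
        if event["op"] == "delete":
            target.pop(k, None)
        else:
            # Inserts and updates are both upserts into the warehouse copy.
            target[k] = {**target.get(k, {}), **event["data"]}
    return target

events = [
    {"op": "insert", "id": 1, "data": {"id": 1, "status": "new"}},
    {"op": "update", "id": 1, "data": {"status": "paid"}},
    {"op": "insert", "id": 2, "data": {"id": 2, "status": "new"}},
    {"op": "delete", "id": 2, "data": {}},
]
table = apply_cdc({}, events)
print(table)  # {1: {'id': 1, 'status': 'paid'}}
```

At warehouse scale the same upsert logic is usually expressed as a MERGE statement over staged change batches, with ordering and late-arrival handling as the hard parts.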

Optional skills task
Build an ELT pipeline with dbt models and tests on a sample dataset, then generate lineage docs.
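dbt expresses this task in SQL models plus YAML tests; as a self-contained sketch of the load, transform, test shape the task asks for, here is the same pattern in plain Python (all table and column names are invented):

```python
# Minimal ELT shape: land raw rows, transform into a typed model, test it.
# In dbt this would be a SQL staging model plus schema tests; the names
# here (raw_orders, stg_orders) are illustrative only.

raw_orders = [  # "load": raw data landed as-is, everything a string
    {"order_id": "1", "amount": "10.50"},
    {"order_id": "2", "amount": "3.25"},
]

def stg_orders(raw):
    """'Transform': cast types, as a dbt staging model would."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in raw
    ]

model = stg_orders(raw_orders)

# "Test": assertions playing the role of dbt's not_null / unique tests.
assert all(r["amount"] is not None for r in model)
assert len({r["order_id"] for r in model}) == len(model)
print(sum(r["amount"] for r in model))  # 13.75
```

Lineage documentation then falls out of the dependency graph between models, which dbt can render from the project itself.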

Career path
Senior Data Engineer → Staff Data Engineer → Platform or Data Architecture Lead.

Job Type: Full-time

Work Location: On the road