Job Opening: Data Engineer

About the Job

Job Summary:

We are seeking a highly skilled and motivated Data Engineer to join our lean and dynamic team working on several major projects. The ideal candidate will have a strong background in data integration, ETL pipelines, and data quality management, and will play a key role in building a robust data foundation that powers real-time executive dashboards and AI-driven insights. This is an exciting opportunity to work on a high-visibility project leveraging Microsoft Fabric, Power BI, and other modern data technologies in a mission-critical, enterprise-level environment.

Key Skills Required:

  • ETL pipeline development (Fabric Data Factory, Power Query, SQL, etc.)
  • Data integration from multiple enterprise systems (APIs, direct connections, file-based)
  • Data quality checks and cleansing
  • Data modelling and transformation best practices
  • Familiarity with Power BI data sources and performance optimization
  • Strong understanding of data governance and security standards
  • Excellent problem-solving and communication skills
  • Ability to work collaboratively in an Agile environment

Minimum Experience:

  • 5+ years of hands-on experience as a Data Engineer on enterprise data integration projects.
  • Proven experience integrating complex datasets from multiple sources and building scalable ETL pipelines.

Roles and Responsibilities:

  • Design and build ETL pipelines to integrate data from multiple enterprise systems (e.g., Oracle Fusion, Salesforce, HRMS) and external APIs into a unified data platform.
  • Develop and manage data transformation logic to ensure consistency, accuracy, and usability of data for Power BI dashboards and AI/ML use cases.
  • Collaborate closely with the BI Developer and AI/ML Specialist to ensure seamless data availability and high-performance reporting.
  • Conduct data profiling and cleansing to maintain high data quality standards.
  • Design the data warehouse data model based on business requirements.
  • Optimize data pipeline performance to support near real-time updates for executive dashboards.
  • Design, develop, and test both batch and real-time Extract, Load, and Transform (ELT) processes required for data integration.
  • Optimize ELT processes to ensure execution times meet requirements.
  • Manage the ingestion of both structured and unstructured data into DAMAC's data lake/data warehouse system.
  • Assess the data quality of source systems and propose the enhancements required to achieve a satisfactory level of data accuracy.
  • Document data flows, transformation rules, and best practices for ongoing reference and governance.
  • Implement security and privacy standards in data handling, in line with industry compliance requirements.
  • Participate in project planning, sprint reviews, and daily stand-ups as part of the Agile project team.
  • Proactively identify and resolve data discrepancies, ensuring data reliability and trustworthiness.
  • Contribute to the continuous improvement of data engineering practices and adoption of new technologies where applicable.

Policies, Systems, Processes & Procedures

  • Follow all relevant departmental policies, processes, standard operating procedures, and instructions so that work is carried out in a controlled and consistent manner.
  • Always demonstrate compliance with the organization's values and ethics to support the establishment of a values-driven culture within the organization.

Continuous Improvement

  • Contribute to the identification of opportunities for continuous improvement and the sustainability of systems, processes, and practices, taking into account global standards and productivity improvement.

Reporting

  • Assist in the preparation of timely and accurate statements and reports to meet department requirements, policies, and quality standards.

Minimum Qualifications:

  • Bachelor's degree in Computer Science, Data Engineering, or a related field.
  • Experience with Microsoft Fabric (Data Factory, OneLake, Direct Lake) and Power BI integration.
  • Familiarity with Delta Lake architecture or similar.
  • Exposure to AI/ML pipeline integration is a plus.

Remuneration:

Attractive package with performance-based incentives and career growth opportunities.

How to Apply:

Interested candidates should send their CV and a brief cover note outlining their relevant experience to hr@iq-data.com. We thank all applicants for their interest; however, only those selected for an interview will be contacted.