Senior Data Engineer - Enterprise Data Warehouse & Lakehouse Solutions (Information Technology/Software/Onsite)

About the job

Our client, a leading Software, Information and Communication Technologies company, operates internationally (Athens, Brussels, Luxembourg, Copenhagen, Stockholm, London, Nicosia, Hong Kong, Valletta, etc.). Our client is a renowned supplier of IT services to government institutions, multinational corporations, public administrations, and research and academic institutes.


Role Overview

Our client currently has a vacancy for a Senior Data Engineer - Enterprise Data Warehouse & Lakehouse Solutions, fluent in English, to offer their services as an expert based in Belgium. The work will be carried out either at the company’s premises or on site at customer premises. For the first assignment, the successful candidate will join the company’s development team, which will cooperate closely with a major client’s IT team on site.


Job Type: Full-Time/Permanent

Location: Brussels

Workplace: Onsite


Requirements

  • University degree in IT combined with at least 13 years of relevant professional IT experience;
  • At least 5 years of experience with relational database systems applied to data warehousing, including data warehouse design and architecture;
  • At least 5 years of experience with code-based data transformation tools such as dbt (data build tool) and Spark;
  • At least 5 years of experience with SQL and with data integration and ETL/ELT tools;
  • Hands-on experience as a Data Engineer on a modern data platform and with data analytics techniques and tools;
  • At least 3 years of experience with Python and with orchestration tools such as Airflow and Dagster;
  • At least 3 years of experience with data modelling tools, as well as online analytical processing (OLAP) and data mining tools;
  • Experience with data platforms such as Fabric, Talend, Databricks and Snowflake;
  • Experience with containerised application development and deployment tools such as Docker, Podman and Kubernetes;
  • Excellent command of the English language.


Responsibilities

  • Development and maintenance of Enterprise Data Warehouses (EDW) and complex Business Intelligence solutions (Data Lakes / Data Lakehouses);
  • Design and development of scalable, reliable data pipelines that transform large volumes of structured and unstructured data;
  • Data integration from various sources, including relational databases, APIs, data streaming services and cloud data platforms;
  • Optimisation of queries and workflows for increased performance and enhanced efficiency;
  • Writing modular, testable and production-grade code;
  • Ensuring data quality through monitoring, validation and data quality checks, maintaining accuracy and consistency across the data platform;
  • Development of test programs;
  • Document processes comprehensively to ensure seamless data pipeline management and troubleshooting;
  • Assistance with deployment and configuration of the system.