About the job BI Engineer III
Permanent (Hybrid working model)
A large financial services company
Guided by senior developers, you will be responsible for establishing new technology components and reusable data pipeline solutions that business-facing development teams can leverage in their day-to-day work.
Development is to follow the set principles, standards, processes, procedures, and guidelines of the wider BI community.
You should be able to communicate technical information to technical teams, and be equally competent at communicating challenges and solutions to your leadership team.
DESCRIPTION OF POSITION:
- Define a structured approach to problem solving and deliver against it
- Load large, complex data sets and make the data available to other data engineers
- Working with other data engineers and data modellers, design, implement, and manage data vaults, data transformations, and the data pipeline
- Monitor and fine-tune data vaults and data transformations on the Cloudera Hadoop stack
- Use modern development and modelling techniques and tools to implement BI and data management solutions, including data quality, metadata, and reference data
- Engage with a wide range of technical stakeholders, including data scientists, data analysts, business analysts, other data engineers, and solutions architects
KNOWLEDGE AND SKILLS:
- Knowledge of and experience in data warehousing and ETL processes
- Knowledge of and proven experience with DataStage at an intermediate level or higher, with at least 2 years' working experience at that level
- An application and data engineering background with a solid grounding in SQL is required
- Knowledge of database management system (DBMS) physical implementation, including tables, joins, and SQL querying
- Knowledge of and experience with structured data, such as entities, classes, hierarchies, relationships, and metadata
- Strong Data Engineering background with a specific focus on staging high quality data
- Understanding of data warehousing principles (e.g. Kimball or Data Vault).
- A solid background in SQL and ETL procedures
- Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, Engineering, or another quantitative field
- National Diploma in an Information Technology-related discipline
- The Data Engineer must have 3 to 7 years' relevant experience in a similar environment
- Data Management technologies (e.g. Informatica Data Quality (IDQ), Informatica Enterprise Data Catalog (EDC), Axon, EBX)
- Data warehousing (Kimball or Data Vault patterns) and dimensional data modelling (e.g. OLAP and MDX experience)
- Experience in developing data pipelines using ETL tools (e.g. SAP Data Services)
- Experience in agile development
- SAP HANA, DB2, SQL, BusinessObjects, and Linux scripting are required
- Automation (e.g. WhereScape), scheduling, and test automation (e.g. Robot Framework)
- Experience with database technologies (e.g. SAP HANA or similar) or Hadoop components, including HDFS, Hive, Spark, Oozie, and Impala, is preferred and highly advantageous
Please note: If you have not heard from us within 2 weeks, please consider your application unsuccessful.