Data Engineer (Intermediate & Senior) - Contract

Location: South Africa (Hybrid; Johannesburg preferred)
Contract: 6-12 Months (Renewable)
Industry: Banking / Financial Services
Positions: 1 × Intermediate (3-5 yrs), 1 × Senior (5-8 yrs)

About the Role

We are looking for experienced Data Engineers to join a leading banking team. The role involves designing and building data pipelines across multiple data platforms, including Orion Hive, Data Lake/Reservoir, Teradata, and SQL environments.
You will work within a fast-paced financial services environment and collaborate with cross-functional data and analytics teams.

Key Responsibilities

  • Develop and maintain data pipelines from Orion Hive, Data Lake/Reservoir, and Teradata into SQL platforms.
  • Design and implement ETL/ELT processes for enterprise data integration.
  • Perform data modelling and optimisation for analytics and reporting.
  • Write clean, efficient code for data ingestion and transformation using C#/.NET.
  • Ensure end-to-end data quality, performance, and reliability.
  • Work with business and technical stakeholders to translate requirements into scalable data solutions.
  • Troubleshoot and resolve data pipeline issues across environments.
  • Contribute to continuous improvement of data engineering tools and best practices.

Minimum Requirements

Technical Skills

  • C# / .NET development experience (required)
  • Strong SQL skills (SQL Server / T-SQL)
  • Hands-on experience with Teradata
  • Experience building data pipelines in:
    • Orion Hive
    • Data Lake/Reservoir
  • Experience using Visual Studio
  • Solid understanding of data modelling, ETL/ELT concepts, and data integration patterns

Experience

  • Intermediate: 3-5 years
  • Senior: 5-8 years

Industry Advantage

  • Banking / Financial Services experience strongly preferred
  • Experience working in enterprise-scale data environments

Soft Skills

  • Strong analytical and problem-solving skills
  • Ability to work under pressure in a high-performance team
  • Clear communication and stakeholder engagement ability
  • Self-driven and solution-focused mindset

Nice to Have

  • Python for advanced ETL or automation
  • Cloud platform experience (Azure, AWS, GCP)
  • Exposure to big-data tooling & distributed systems
  • Data governance & metadata management understanding