AWS Data Engineer (Contract) - Gauteng/Hybrid - ISB2211154

Our client requires the services of a Senior Data Engineer/Scientist, working on a Midrand/Menlyn/Rosslyn/Home Office rotation.

  • Amazing brand with cutting-edge technology
  • Excellent teams with global collaboration
  • Strong work-life balance with flexible hours
  • Agile working environment

POSITION: Contract until December 2026

EXPERIENCE: 6-8 years of related working experience.

COMMENCEMENT: As soon as possible

QUALIFICATIONS/EXPERIENCE

  • South African citizens/residents are preferred.
  • Relevant IT / Business / Engineering Degree
  • Candidates with one or more of the following certifications are preferred:
    • AWS Certified Cloud Practitioner
    • AWS Certified SysOps Administrator - Associate
    • AWS Certified Developer - Associate
    • AWS Certified Solutions Architect - Associate
    • AWS Certified Solutions Architect - Professional
    • HashiCorp Certified: Terraform Associate


ESSENTIAL SKILLS:

  • Terraform
  • Python 3.x
  • SQL - Oracle/PostgreSQL
  • PySpark
  • Boto3
  • ETL
  • Docker
  • Linux / Unix
  • Big Data
  • PowerShell / Bash


ADVANTAGEOUS TECHNICAL SKILLS

  • Demonstrated expertise in data modelling and Oracle SQL.
  • Exceptional analytical skills for analysing large and complex data sets.
  • Perform thorough testing and data validation to ensure the accuracy of data transformations.
  • Strong written and verbal communication skills, with precise documentation.
  • Self-driven team player with the ability to work independently and multi-task.
  • Experience working with enterprise collaboration tools such as Confluence, JIRA, etc.
  • GROUP Cloud Data Hub (CDH)
  • GROUP CDEC Blueprint
  • Experience developing technical documentation and artefacts.
  • Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV etc.
  • Experience working with Data Quality Tools such as Great Expectations.
  • Experience developing and working with REST APIs is a bonus.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, AWS RDS, or DynamoDB.
  • Experience and solid understanding of various software design patterns.
  • Experience preparing specifications from which programs will be written, designed, coded, tested and debugged.
  • Strong organizational skills.


Basic experience/understanding of AWS Components (in order of importance):

  • Glue
  • CloudWatch
  • SNS
  • Athena
  • S3
  • Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
  • Lambda
  • DynamoDB
  • Step Function
  • Parameter Store
  • Secrets Manager
  • CodeBuild/CodePipeline
  • CloudFormation
  • Business Intelligence (BI) Experience
  • Technical data modelling and schema design (not drag and drop)
  • Kafka
  • AWS EMR
  • Redshift
  • Basic experience in networking and troubleshooting network issues.
  • Knowledge of the Agile Working Model


ROLE:

  • Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms.
  • Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.

NB: By applying for this role, you consent to be added to the iSanqa database and to receive updates until you unsubscribe.
Also note that if you have not received a response from us within 2 weeks, your application was unsuccessful.

#isanqa #isanqajobs #DataScientist #Dataengineer #Terraform #Python #SQL #FuelledbyPassionIntegrityExcellence