AWS Data Engineer (Contract) - Gauteng/Hybrid - ISB2212050

Our client requires the services of a Data Engineer/Scientist (Expert) on a Midrand/Menlyn/Rosslyn/Home Office rotation.

  • Amazing brand with cutting-edge technology
  • Excellent teams and global collaboration
  • Strong work-life balance with flexible hours
  • Agile working environment

POSITION: Contract until December 2026

EXPERIENCE: 8+ years of related working experience.

COMMENCEMENT: As soon as possible

QUALIFICATIONS/EXPERIENCE

  • South African citizens/residents are preferred.
  • Relevant IT/Business/Engineering degree
  • Certifications (candidates with one or more of the following are preferred):
      • AWS Certified Cloud Practitioner
      • AWS Certified SysOps Administrator - Associate
      • AWS Certified Developer - Associate
      • AWS Certified Solutions Architect - Associate
      • AWS Certified Solutions Architect - Professional
      • HashiCorp Certified: Terraform Associate

ESSENTIAL SKILLS:

  • Above-average experience/understanding of the following, in order of importance (a brief illustrative sketch follows this list):
      • Terraform
      • Python 3.x
      • SQL (Oracle/PostgreSQL)
      • PySpark
      • Boto3
      • ETL
      • Docker
      • Linux/Unix
      • Big Data
      • PowerShell/Bash
      • GROUP Cloud Data Hub (CDH)
      • GROUP CDEC Blueprint
  • Experience working with enterprise collaboration tools such as Confluence and JIRA.
  • Experience developing technical documentation and artefacts.
  • Knowledge of data formats such as Parquet, AVRO, JSON, XML, and CSV.
  • Experience working with data quality tools such as Great Expectations.
  • Knowledge of the Agile Working Model.
  • Any additional responsibilities assigned in the Agile Working Model (AWM) Charter.
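
As an indication of the level expected, here is a minimal, illustrative PySpark ETL sketch: read CSV from S3, apply a simple transformation and a basic data-quality check, and write partitioned Parquet. All bucket, path, and column names are hypothetical placeholders (not details of this role), and it assumes an S3-aware Spark runtime such as Glue or EMR.

    # Minimal PySpark ETL sketch; all names below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("csv-to-parquet-etl").getOrCreate()

    # Extract: read raw CSV from S3 (placeholder path; assumes Glue/EMR-style S3 access).
    raw = (
        spark.read.option("header", "true")
        .option("inferSchema", "true")
        .csv("s3://example-raw-bucket/orders/*.csv")
    )

    # Transform: parse dates, derive a partition column, drop duplicate keys.
    cleaned = (
        raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
        .withColumn("year", F.year("order_date"))
        .dropDuplicates(["order_id"])
    )

    # Basic data-quality gate (in the spirit of tools like Great Expectations):
    # fail fast if any primary key is null.
    null_ids = cleaned.filter(F.col("order_id").isNull()).count()
    if null_ids > 0:
        raise ValueError(f"{null_ids} rows have a null order_id")

    # Load: write partitioned Parquet back to S3 (placeholder path).
    cleaned.write.mode("overwrite").partitionBy("year").parquet(
        "s3://example-curated-bucket/orders/"
    )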

ADVANTAGEOUS TECHNICAL SKILLS

  • Demonstrated expertise in data modelling with Oracle SQL.
  • Exceptional analytical skills for analysing large and complex data sets.
  • Thorough testing and data validation to ensure the accuracy of data transformations.
  • Strong written and verbal communication skills, with precise documentation.
  • Self-driven team player with the ability to work independently and multi-task.
  • Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
  • Familiarity with data stores such as AWS S3, RDS, and DynamoDB.
  • Experience with, and a solid understanding of, common software design patterns.
  • Experience preparing specifications from which programs are designed, coded, tested, and debugged.
  • Strong organisational skills.
  • Experience developing and working with REST APIs is a bonus.
  • Basic experience in networking and troubleshooting network issues.
  • Basic experience/understanding of the following AWS components, in order of importance (a short Boto3 sketch follows this list):
      • Glue
      • CloudWatch
      • SNS
      • Athena
      • S3
      • Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
      • Lambda
      • DynamoDB
      • Step Functions
      • Parameter Store
      • Secrets Manager
      • CodeBuild/CodePipeline
      • CloudFormation
  • Business Intelligence (BI) experience
  • Technical data modelling and schema design (not drag and drop)
  • Kafka
  • AWS EMR
  • Redshift
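
As a hedged illustration of the Boto3 familiarity described above, the sketch below starts an AWS Glue job, polls it until it reaches a terminal state, and publishes the outcome to an SNS topic. The job name and topic ARN are hypothetical placeholders, not details of this role.

    # Minimal Boto3 sketch: run a Glue job and report the result via SNS.
    # The job name and topic ARN are hypothetical placeholders.
    import time

    import boto3

    GLUE_JOB_NAME = "example-orders-etl"  # hypothetical job name
    SNS_TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:example-data-alerts"  # hypothetical ARN

    glue = boto3.client("glue")
    sns = boto3.client("sns")

    # Kick off the Glue job run and keep its run id.
    run_id = glue.start_job_run(JobName=GLUE_JOB_NAME)["JobRunId"]

    # Poll until the run reaches a terminal state.
    while True:
        state = glue.get_job_run(JobName=GLUE_JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "ERROR", "TIMEOUT"):
            break
        time.sleep(30)

    # Notify subscribers of the outcome.
    sns.publish(
        TopicArn=SNS_TOPIC_ARN,
        Subject=f"Glue job {GLUE_JOB_NAME}: {state}",
        Message=f"Run {run_id} finished in state {state}.",
    )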

ROLE:

  • Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms.
  • Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.

NB: By applying for this role, you consent to be added to the iSanqa database and to receive updates until you unsubscribe.
Please note that if you have not received a response from us within two weeks, your application was unsuccessful.

#isanqa #isanqajobs #AWSCloudtechnology #Enterprisedata #Platforms #Purchasing #Finance #Sales #Agileworking #FuelledbyPassionIntegrityExcellence