Job Openings
AWS Data Engineer (Contract) - Gauteng/Hybrid - ISB2312517
Our client requires the services of a Senior Data Engineer/Scientist (Midrand/Menlyn/Rosslyn/home office rotation).
- Amazing brand with cutting-edge technology
- Excellent global team collaboration
- Strong work-life balance with flexible hours
- Agile working environment
POSITION: Contract until December 2026
EXPERIENCE: 6-8 years of related working experience.
COMMENCEMENT: As soon as possible
QUALIFICATIONS/EXPERIENCE
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator - Associate
- AWS Certified Developer - Associate
- AWS Certified Solutions Architect - Associate
- AWS Certified Solutions Architect - Professional
- HashiCorp Certified: Terraform Associate
ESSENTIAL SKILLS:
- Terraform
- Python 3.x
- SQL - Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell/Bash
- Experience working with enterprise collaboration tools such as Confluence and Jira.
- Experience developing technical documentation and artefacts.
- Knowledge of data formats such as Parquet, Avro, JSON, XML and CSV.
- Experience working with Data Quality Tools such as Great Expectations.
- Experience developing and working with REST APIs is a bonus.
- Basic experience in Networking and troubleshooting network issues.
- Any additional responsibilities assigned in the Agile Working Model (AWM) Charter.
ADVANTAGEOUS TECHNICAL SKILLS
- Exceptional analytical skills for analysing large, complex data sets.
- Thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with the ability to work independently and multi-task.
- Experience building data pipelines using AWS Glue, AWS Data Pipeline or similar platforms.
- Familiarity with data stores such as Amazon S3, Amazon RDS and DynamoDB.
- Experience with, and a solid understanding of, various software design patterns.
- Experience preparing specifications from which programs will be written, designed, coded, tested and debugged.
- Strong organizational skills.
- Advanced experience/understanding of AWS Components (in order of importance):
- Glue
- CloudWatch
- SNS
- Athena
- S3
- Kinesis (Kinesis Data Streams, Kinesis Data Firehose)
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- CodeBuild/CodePipeline
- CloudFormation
- Technical data modelling and schema design (not drag-and-drop)
- Kafka
- AWS EMR
- Redshift
ROLE:
- Data Engineers are responsible for building and maintaining big data pipelines using GROUP data platforms.
- Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.
NB: By applying for this role, you consent to be added to the iSanqa database and to receive updates until you unsubscribe.
Also note that if you have not received a response from us within 2 weeks, your application was unsuccessful.
#isanqa #isanqajobs #AWSDataEngineer #BigData #Terraform #Python3x #SQL #Softwareengineer #AWSCloud #Agileworking #FuelledbyPassionIntegrityExcellence