Data Engineer
"Join Us as a Data Alchemist: Turn Raw Data into Golden Insights!"
Location: Johannesburg, South Africa - Hybrid | Salary: Competitive (Because you're worth it!)
Are you a data virtuoso with a flair for crafting seamless data highways? We're on the hunt for a Data Engineer extraordinaire to join our client’s dynamic Technology Team. This role offers the unique opportunity to collaborate with clients, designing secure, scalable, and cost-effective solutions that harness the power of both on-premises technologies and Cloud platforms like Azure and AWS. If you possess a blend of Cloud expertise, hands-on data engineering experience, and the ability to communicate complex ideas with creativity and clarity, we want to hear from you!
Your Mission, Should You Choose to Accept:
- Design and implement scalable data pipelines using Cloud services such as AWS Glue, Redshift, S3, Lambda, EMR, and Athena, as well as Microsoft Fabric and Databricks.
- Develop and maintain ETL processes to transform and integrate data from diverse sources.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver top-notch data solutions.
- Optimize and fine-tune the performance of data pipelines and queries.
- Ensure data quality and integrity through rigorous testing and validation processes.
- Implement data security and compliance best practices.
- Monitor and troubleshoot data pipeline issues, ensuring timely resolutions.
- Stay abreast of the latest developments in data engineering technologies and best practices.
Your Toolkit:
- A Bachelor's degree from an accredited university.
- 3+ years' hands-on experience in data engineering; prior experience in the financial services industry is a plus but not mandatory. Bonus points if you have worked with banks or insurers!
- Proven skills in building and managing data solutions using on-premises technologies or Cloud platforms.
- Familiarity with core Cloud data services, including AWS Glue, Redshift, S3, Lambda, EMR, and Athena, or Microsoft Fabric and Databricks.
- Knowledge of big data technologies such as Apache Spark, Hadoop, or Kafka.
- Proficiency in programming languages such as Python (including pandas) and SQL.
- Experience working with relational databases such as Amazon RDS, Microsoft SQL Server, Azure SQL Database, or PostgreSQL.
- Solid understanding of data modeling, ETL processes, and data warehousing.
- Experience with Infrastructure as Code (IaC) tools like AWS CloudFormation, Terraform, or Azure ARM/Bicep.
- Familiarity with CI/CD tools to streamline software development and delivery.
- Excellent communication, problem-solving, and analytical skills, with the ability to present complex technical concepts clearly and concisely.
- Relevant Cloud certifications are a plus, showcasing your commitment to professional development and mastery of Cloud services and best practices.
- Strong desire to learn and enhance business knowledge.
If you're ready to embark on a data-driven adventure and elevate your career to new heights, apply now and let's revolutionize the world of data together!