Job Openings
Intermediate Data Solutions Engineer
Exciting Opportunity for an Intermediate Data Solutions Engineer
Elevate your career with us, where we prioritise people and nurture a culture of growth, collaboration, and innovation.
Join a dynamic, multidisciplinary environment that encourages versatility, and provides individual attention to career development, including intentional and regular one-on-ones with leadership.
Tackle complex challenges and make a meaningful impact alongside passionate professionals who inspire and support each other.
WHAT YOU WILL BE DOING
DELIVERY
- Wrangling big data from multiple sources into a reliable, evolving asset.
- Collaborating with Solution Architects to design and implement cloud, hybrid, and on-prem solutions.
- Designing and implementing optimal data pipelines and delivering impactful advanced analytics using ML, AI, cognitive services, and data science.
- Working across descriptive, diagnostic, predictive, and prescriptive analytics.
- Familiarising yourself with ML algorithms and AI approaches, including clustering, regression, classification, speech recognition, OpenAI, LLMs, vision, search, and translation.
- Understanding DataOps and MLOps to ensure continuous deployment in short cycles with maximum impact and high quality.
- Maintaining data lineage and metadata effectively.
- Designing and developing business dashboards.
- Using technical and data analytics expertise to improve business innovations.
PEOPLE
- Demonstrating maturity, time management and motivation in the workplace.
- Championing change within the team and fostering a collaborative culture.
- Collaborating with cross-functional teams to tackle complex business challenges, driving continuous improvement.
- Upholding standards of excellence and accountability.
CUSTOMER
- Building strong relationships and understanding customer needs to deliver impactful solutions.
- Managing customer expectations and identifying opportunities for service improvements.
- Ensuring compliance with data security and industry regulations.
WHAT WE NEED FROM YOU
- A bachelor's degree in IT, Engineering, or a related field.
- A minimum of 4 years' experience as a data solutions engineer in an enterprise environment.
- Proficiency in software engineering fundamentals and commitment to clean code and best practices.
- Experience working in a delivery team together with other disciplines to deliver a product.
- Strong ability to model and analyse data.
- Solid understanding of data engineering methodologies.
- Good understanding of visualisation best practices.
- A desire to design and implement modern advanced analytics solutions and modern data warehousing solutions.
- Strong collaboration and communication skills with a drive for continual improvement.
- Willingness to embrace a hybrid working model, working from the office on average 2–3 times per week.
- A strong customer-facing ability.
- Flexibility to work across a variety of different environments and tooling.
WHAT YOU WILL GET FROM US
- A competitive compensation package with twice-yearly salary increases and guaranteed bonuses.
- Tenure-based loyalty leave.
- Access to employee wellness programs and rewards for professional development.
- Commission incentives for successful business development leads and employee referrals.
- Hands-on support from leadership through regular one-on-ones and quarterly reviews.
- Continuous development programs offering training in leadership and technical skills.
- Exposure to enterprise projects across multiple domains and problem spaces for South Africa's most esteemed organisations.
THE TECH STACKS WE USE
*Popular choices, but not limited to these
- Cloud environments: Azure, AWS, Google
- Front-end tech: Power BI, Microsoft Fabric, Amazon QuickSight, Google Data Studio, Grafana
- Languages: SQL, Python, Scala
- Data processing: Azure Data Factory, AWS Glue, Apache Airflow, Databricks, Spark SQL
- Relational databases: MSSQL, PostgreSQL, Oracle, MySQL, Azure SQL, Amazon RDS
- NoSQL databases: MongoDB, Cosmos DB, DynamoDB, Neo4j
- Streaming: Azure Stream Analytics, Kafka, Databricks streaming, Spark streaming, Google Dataflow, Amazon Kinesis
- ML Platforms: Azure Machine Learning, Databricks ML, Amazon SageMaker, Dataiku
- AI: Azure OpenAI, Azure Cognitive Services, AWS Cognitive Services, Google Cognitive Services