G08 - Data Engineer
About the job
We are looking for experienced data engineers to join our team. You will be responsible for:
Data Engineering and Platform Integration
- Design, develop, and maintain data pipelines and ETL processes using AWS services (Glue, Athena, S3, RDS)
- Work with data virtualisation tools like Denodo and develop VQL queries
- Ingest and process data from various internal and external data sources
- Perform data extraction, cleaning, transformation, and loading operations
- Implement automated data collection processes, including API integrations when necessary (see the ingestion sketch after this list)
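For illustration only, a minimal sketch of the kind of automated API-to-S3 collection described above. The endpoint URL, bucket, and key layout are hypothetical placeholders, not a description of our actual stack.

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical source API and landing bucket -- placeholder names only.
API_URL = "https://api.example.com/v1/orders"
LANDING_BUCKET = "example-raw-landing"


def ingest_orders(updated_since: str) -> str:
    """Pull records updated since the given ISO date and land them in S3 as JSON."""
    response = requests.get(API_URL, params={"updated_since": updated_since}, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Partition the landing key by load date so downstream Glue crawlers/jobs
    # can pick up each day's extract independently.
    load_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"orders/load_date={load_date}/orders.json"

    s3 = boto3.client("s3")
    s3.put_object(Bucket=LANDING_BUCKET, Key=key, Body=json.dumps(records))
    return key


if __name__ == "__main__":
    print(ingest_orders("2024-01-01"))
```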
Data Architecture
- Design and implement data models (conceptual, logical, and physical) using tools like ER Studio
- Develop and maintain data warehouses, data lakes, and operational data stores
- Develop and maintain data blueprints
- Create data marts and analytical views to support business intelligence needs using Denodo and RDS
- Implement master data management practices and data governance standards
Technical Architecture and Integration
- Ensure seamless integration between various data systems and applications
- Implement data security and compliance requirements
- Design scalable solutions for data integration and consolidation
Development and Analytics
- Develop Python scripts in AWS Glue for data processing and automation (see the Glue sketch after this list)
- Write efficient VQL/SQL queries and stored procedures
- Design and develop RESTful APIs for data services using modern frameworks and best practices
- Work with AWS SageMaker for machine learning model deployment and integration
- Manage and optimise database performance, including indexing, query tuning, and maintenance
- Work in an Agile environment and participate in sprint planning, daily stand-ups, and retrospectives
- Implement and maintain CI/CD pipelines for automated testing and deployment
- Participate in peer code reviews and pair programming sessions
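As a rough illustration of the Glue scripting mentioned above, here is a minimal sketch of a PySpark-based Glue job that reads a catalogued raw table, maps it to a curated schema, and writes partitioned Parquet to S3. The database, table, and bucket names are hypothetical.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve the job name and initialise the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# "raw_db" / "customer_raw" are placeholder Glue Data Catalog names.
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="customer_raw",
)

# Rename and cast source columns into the curated schema.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("cust_id", "string", "customer_id", "string"),
        ("signup_dt", "string", "signup_date", "date"),
        ("country", "string", "country_code", "string"),
    ],
)

# Write partitioned Parquet to a placeholder curated bucket.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/customer/",
        "partitionKeys": ["country_code"],
    },
    format="parquet",
)

job.commit()
```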
Documentation and Best Practices
- Create and maintain technical documentation for data models and systems
- Follow industry-standard coding practices, version control, and change management procedures
Stakeholder Collaboration
- Partner with cross-functional teams on data engineering initiatives
- Gather requirements, conduct technical discussions, implement solutions, and perform testing
- Collaborate with Product Managers, Business Analysts, Data Analysts, Solution Architects, and UX Designers to build scalable, data-driven products
- Provide technical guidance and support for data-related queries
Qualifications and Experience:
- At least 3 years of experience in data engineering or a similar role
- Strong proficiency in Python, VQL, and SQL
- Experience with AWS services (Glue, Athena, S3, RDS, SageMaker)
- Knowledge of data virtualisation concepts and tools (preferably Denodo)
- Experience with BI tools (preferably Tableau or Power BI)
- Understanding of data modelling and database design principles
- Familiarity with data governance and master data management concepts
- Experience with version control systems (GitLab) and CI/CD pipelines
- Experience working in Agile environments with iterative development practices
- Strong problem-solving skills and attention to detail
- Excellent communication skills and ability to work in a team environment
- Knowledge of AI technologies (Amazon Bedrock, Azure AI, LLMs) would be advantageous