Job Openings
Cloud Data Engineer
Contract Position
Minimum Requirements:
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Relevant certifications, e.g., SAP, Microsoft Azure
- At least 8 years of experience as a Cloud Data Engineer
Technologies:
- Azure
- SQL
- Databricks
Required Experience:
- Seasoned experience in data engineering and data mining within a fast-paced environment
- Proficient in building modern data analytics solutions that deliver insights from large, complex data sets at multi-terabyte scale
- Seasoned experience with the architecture and design of secure, highly available, and scalable systems
- Proficient in automation and scripting, with proven examples of successful implementation
- Seasoned experience in an applicable programming language, preferably .NET
- Proficient in working with SAP, MySQL, and Microsoft SQL Server databases
- Seasoned experience working with data sets and organising data using MS Excel features, e.g., macros and pivot tables
- Seasoned experience working with internet technologies, e.g., SaaS
Responsibilities:
- Create data models in a structured data format to enable analysis thereof
- Design and develop scalable extract, transform, load (ETL) packages from business source systems, and develop ETL routines to populate data from those sources
- Participate in the transformation of object and data models into appropriate database schemas within design constraints
- Interpret installation standards to meet project needs and produce database components as required
- Create test scenarios and participate in thorough testing and validation to support the accuracy of data transformations
- Develop and maintain mission-critical information extraction, analysis, and management systems
- Provide direct and responsive support for urgent analytic needs
- Translate loosely defined requirements into solutions
- Use open source technologies and tools to accomplish specific use cases encountered within the project
- Use coding languages or scripting methodologies to solve problems with custom workflows
- Perform incremental testing of code, processes, and deployments to identify ways to streamline execution and minimise errors
Knowledge and skills:
- Seasoned in defining and managing scoping requirements, including definition and prioritisation activities
- Good understanding of database concepts, object and data modelling techniques, and design principles, with conceptual knowledge of building and maintaining physical and logical data models
- Specialist in at least one of Microsoft Azure Data Factory, SQL Server Analysis Services, SAP Data Services, or SAP BTP
- Good understanding of data architecture landscape between physical and logical data models