Snowflake Data Engineer
Intellinexus in a nutshell:
Intellinexus is a business-led data and business intelligence consultancy that empowers organisations with data-driven decision-making capabilities. We combine innovation and expertise with access to the best talent to help organisations become smarter, more agile, and more resilient to change in their operating environments. Our expert teams apply a mastery of data and technology to craft strategies that revolutionise business decision-making and give organisations access to real-time, actionable insights into the performance of their business. For more information, please visit www.intellinexusgroup.com.
What Intellinexus offers:
- Intellinexus offers remote and hybrid opportunities
- As an SME, Intellinexus gives you broader exposure to different technologies, solutions, techniques, and perspectives on projects than you might get at larger firms where specialists own those areas
- Training and development are a priority at Intellinexus, and time and resources are allocated to grow consultants' careers
What you will do:
The Snowflake Data Engineer will work collaboratively with Project Managers, Data Scientists, and Systems Architects to define data sources, data ingestion, modelling, and reporting artefacts within a custom data framework. Together, these teams will enable data-driven, actionable insights.
Core responsibilities include:
- Work within a highly specialised and growing team to enable delivery of data and advanced analytics system capability
- Develop and implement a reusable architecture of data pipelines to make data available for various purposes including Machine Learning (ML), Analytics and Reporting
- Work comfortably with structured and unstructured data in a variety of programming languages such as SQL, R, Python, and Java
- Solid experience in architecting, designing, and operationalising large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must
- Provide support for Data Warehouse issues such as data load problems
- Understand data pipeline requirements and which tools to leverage to get the job done
- Test and clearly document implementations so that others can easily understand the requirements, implementation, and test conditions
- Build data solutions that leverage controls to ensure privacy, security, compliance, and data quality
- Apply an understanding of metadata management systems and orchestration architecture when designing pipelines
- Integrate Business Intelligence systems with source transactional systems
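To give candidates a feel for the "reusable pipeline with built-in data-quality controls" responsibility above, here is a minimal, hedged sketch in Python. The record schema and validation rules are illustrative assumptions only, not part of Intellinexus's actual framework:

```python
from dataclasses import dataclass

# Hypothetical sketch of one reusable pipeline step with a data-quality
# control. The Record schema and the rules below are illustrative
# assumptions, not the actual Intellinexus data framework.

@dataclass
class Record:
    customer_id: str
    amount: float

def validate(records):
    """Split records into loadable rows and quarantined rows.

    Rows failing a quality rule are quarantined rather than silently
    dropped, so downstream loads stay clean and issues stay auditable.
    """
    good, quarantined = [], []
    for r in records:
        if r.customer_id and r.amount >= 0:
            good.append(r)
        else:
            quarantined.append(r)
    return good, quarantined

good, bad = validate([Record("C1", 10.0), Record("", 5.0), Record("C2", -1.0)])
# good holds the one clean row; bad holds the two failing rows
```

In a real pipeline, a step like this would sit between ingestion and the Snowflake load, with the quarantined rows written to an error table for review.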
Qualification & Experience:
- Honours or Master's degree in Computer Science, Engineering, or Software Engineering
- Minimum 2 years' experience developing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse
- 4 years of hands-on experience building productised data ingestion and processing pipelines using standard data pipeline tools (Spark, Python, ADF, etc.)
- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as SQL Server, Teradata, Oracle, or DB2
- Expertise in implementing role-based access controls on Snowflake
- Ability to write stored procedures and complex queries in Snowflake
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect solutions in environments with unclear requirements
- AWS experience is a bonus
- SnowPro certification preferred
If you are interested:
Please forward your CV to [email protected] or contact me on 0620760768.