Data Engineer

About the Job


Our client is a leader in SaaS and network solutions, and data is at the core of their platform. The Data Engineering Team ensures data flows seamlessly through their systems, from collection to analysis. As a Data Engineer, you'll manage complex data storage systems, design efficient pipelines, and enable data products for near-real-time analytics.

You will also play a key role in managing and optimizing their data lake infrastructure, built on Snowflake, ensuring scalable, high-performance data processing and accessibility. They are seeking a Data Engineer skilled in Python and DBT, with a strong ability to bring fresh ideas and innovative solutions to data engineering processes, to enhance their data infrastructure and support their analytics and business intelligence initiatives.

What You'll Do

  • Design and Maintain Data Pipelines: Build robust, scalable data pipelines using Snowflake to ensure high availability and reliability of data processing.
  • Utilize DBT for Transformations: Manage data transformations with DBT, including building, testing, and documenting data models to ensure data accuracy and usability.
  • Collaborate Across Teams: Work with data scientists, analysts, and business stakeholders to understand data needs and deliver comprehensive data solutions.
  • Implement ETL Processes: Extract, transform, and load data from various sources into Snowflake, ensuring efficiency and reliability.
  • Optimize Data Models: Improve existing data models and warehouse performance to enhance query efficiency and reduce operational costs.
  • Monitor and Troubleshoot: Proactively identify and resolve data issues, ensuring data integrity and accuracy across the ecosystem.
  • Document Workflows: Maintain thorough documentation for data workflows, processes, and architecture to support knowledge sharing and team training.
  • Stay Current: Keep up with industry trends and advancements in cloud data warehousing, data modeling tools, and best practices.


What We're Looking For

We value data engineers who are innovative, collaborative, and passionate about building robust and scalable data solutions. The ideal candidate has a strong foundation in data engineering principles and thrives in a dynamic, fast-paced environment.

Needed Qualifications:

  • Proven experience as a Data Engineer, with a focus on Snowflake and DBT.
  • Proficiency in SQL and experience with programming languages such as Python or Java.
  • Experience with ETL tools such as Fivetran, Airbyte, Apache Airflow, or Talend, and data integration techniques.
  • Strong understanding of data modeling principles and data warehousing concepts.
  • Familiarity with cloud platforms, particularly AWS, and services such as RDS, S3, and Lambda.
  • Excellent analytical and problem-solving skills.
  • Strong communication and teamwork abilities.

Preferred Qualifications

  • Expertise in optimizing Snowflake performance and cost-efficiency.
  • Experience with real-time data streaming platforms such as Apache Flink or Kafka.
  • Familiarity with monitoring and observability tools for data pipelines and distributed systems.
  • Exposure to Kubernetes or other container orchestration tools.
  • Experience with SaltStack or similar configuration management tools.
  • Ability to innovate and implement cutting-edge data engineering practices.

Don't Meet Every Requirement? Apply anyway! Send us your up-to-date CV.


Only shortlisted applicants will be contacted with information about the next steps of our selection process.


Applications are treated with strict confidentiality, in accordance with applicable personal data protection legislation.