Job Title: Senior Data Engineer or Data Engineer Lead

Reports To: Enterprise Data Group Heads
Employment Type: [Full-time]
Location: [On-site/Hybrid]

Job Summary

We are seeking a highly skilled and motivated Senior Data Engineer or Data Engineer Lead to play a pivotal role in our enterprise data initiatives. This position is responsible for designing, building, and maintaining scalable and efficient data pipelines and data products, ensuring alignment with both enterprise-wide and domain-specific goals. The ideal candidate will combine hands-on expertise in modern data engineering technologies with the leadership capabilities to guide junior engineers, define modeling standards, and drive innovation in data architecture.

Key Responsibilities

Data Engineering & Pipeline Development

  • Design, build, and optimize robust ETL/ELT processes for ingesting data from various sources into Snowflake and Redshift.

  • Develop high-throughput data pipelines using technologies like Confluent Kafka, SQL, and Python.

  • Ensure data quality, reliability, and integrity through validation, monitoring, and testing processes.

  • Troubleshoot data processing issues and lead root cause analysis.

Data Product Design & Modeling

  • Build and maintain Common Data Products (enterprise-wide canonical models) or Business Data Products (domain-specific models with applied logic and calculations).

  • Collaborate with stakeholders and architects to translate business needs into scalable, reusable data models.

  • Define and enforce modeling standards, naming conventions, documentation practices, and test strategies using dbt.

Leadership & Collaboration

  • Mentor and guide junior engineers through code reviews, technical coaching, and design oversight.

  • Drive continuous improvement in data engineering practices, tool adoption, and process optimization.

  • Deliver technical and executive updates to stakeholders and leadership.

  • Participate in Proof of Concept (PoC) initiatives to evaluate and recommend new tools or frameworks.

Qualifications and Skills

Required:

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

  • 7+ years of experience in data engineering, data architecture, or related roles.

  • Advanced proficiency in SQL and Python.

  • Strong experience with Snowflake, Redshift, and dbt for ELT modeling and data warehousing.

  • Hands-on experience with streaming platforms such as Confluent Kafka.

  • Proven ability to design and support data warehouses, data products, and data hubs.

  • Strong analytical skills and the ability to communicate complex data solutions clearly.

Preferred:

  • Familiarity with data mesh principles, modular data product design, or enterprise data governance.

  • Knowledge of data quality, data architecture, and master data management concepts.

  • Experience with CI/CD pipelines and modern data orchestration tools.

  • Familiarity with cloud-based data platforms and optimization of ELT data flows.

  • Experience with version control systems such as Git.