Job Openings
Senior Data Engineer (Hybrid) | Cubao
About the job
Schedule: AU Hours
Work Setup: 3x Onsite (Gateway, Cubao)
We are looking for a Senior Data Engineer with extensive hands-on experience in Data Architecture and Engineering to serve as a technical contributor within our team. As a Senior Data Engineer, you will be responsible for designing, building, and deploying cutting-edge data solutions while helping to establish our data practice.
Responsibilities:
- Contribute to technical engagements and proactively identify opportunities to expand CI&T's business with clients.
- Collaborate on multiple projects across various domains, providing diverse subject matter expertise.
- Participate in technical design and architecture discussions to ensure robust solutions.
- Ensure adherence to security and performance best practices in all solutions.
- Create and maintain technical documentation required by clients.
- Communicate effectively with customer team members to ensure alignment and clarity of technical solutions.
- Understand client requirements and develop viable technical solutions by selecting appropriate frameworks.
- Maintain a stable and productive development environment for all team members.
Requirements:
- 6+ years of experience in software development
- Proven expertise in Data Warehousing and Data Analytics projects, encompassing data acquisition, transformation, and data science initiatives.
- Exposure to modern data file formats such as Delta Lake, Apache Iceberg, and Parquet.
- Proficiency in Python, Spark, dbt, and other data transformation tools and frameworks.
- Expertise in data modelling, with familiarity in conceptual, logical, and physical data models, following methodologies like Data Vault 2.0 and Dimensional Modelling.
- Experience with cloud data services such as Microsoft Intelligent Data Platform, AWS Data Services, or Google Cloud Data Analytics.
- Minimum of 3 years' experience with Snowflake
- Hands-on experience with cloud data platforms such as Databricks and/or Snowflake; relevant certifications are a plus.
- Skilled in data pipeline orchestration tools, including Azure Data Factory, AWS Glue, Apache Airflow, or Prefect.
- Experience with CI/CD and MLOps pipelines to streamline development and deployment processes.
- Proficient in Infrastructure as Code (IaC) for efficient infrastructure management.
- Expertise in data migration, performance tuning, and optimization of databases and SQL queries.
- Understanding of Data Governance and Data Management concepts.
- Understanding of design patterns, clean architecture, and clean coding principles.
- Familiarity with unit, integration, and end-to-end (E2E) testing.