Data Integration Engineer

About the Role

The Senior Data Integration Engineer designs, develops, and operates end-to-end data integration pipelines that support enterprise data platforms. The role spans environment setup, pipeline development, and operational support, with a focus on reliable, scalable, and secure data delivery. The engineer works closely with product teams, business analysts, vendors, and source-system teams to implement integration solutions and maintain data quality.

Key Responsibilities

Data Integration Development

  • Design, build, and test ETL/ELT pipelines to integrate data from multiple source systems into target data platforms.

  • Define data mappings and transformation rules with business analysts and stakeholders.

  • Ensure data quality through validation, testing, and monitoring processes.

  • Produce technical designs and documentation for data integration solutions.

  • Evaluate emerging technologies and automation approaches to improve data delivery.

Environment & Platform Management

  • Implement cloud-based data integration solutions using tools such as Azure Data Factory, Databricks, and related services.

  • Support setup and maintenance of development and production environments.

  • Maintain CI/CD pipelines and deployment processes.

  • Ensure secure, scalable, and event-driven data integration architecture.

Operations & Support

  • Monitor and maintain production data pipelines.

  • Troubleshoot and resolve integration issues to ensure stable operations.

Collaboration & Continuous Improvement

  • Participate in Agile development activities (Scrum/Kanban).

  • Apply best practices in coding, testing, and deployment.

  • Continuously improve processes and recommend enhancements to the data platform.

Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related field.

  • At least 5 years of experience in data engineering, ETL development, or system integration.

  • Strong experience in the Azure data ecosystem (Azure Data Factory, Databricks, Data Lake, Blob Storage, Azure SQL).

  • Proficiency in at least one programming language such as Python, Java, C#, Node.js, or Go.

  • Experience developing and integrating REST APIs in enterprise environments.

  • Hands-on experience with CI/CD pipelines and Azure DevOps.

  • Strong collaboration and communication skills.

  • Familiarity with ITIL practices and Agile frameworks.

Preferred

  • Experience with Azure Service Bus or microservices frameworks.

  • Exposure to other cloud platforms (AWS or OCI).

  • Experience with the Oracle ecosystem.

  • Azure certifications (e.g., AZ-900, DP-900, or higher).

Work Arrangement

  • Hybrid setup, with onsite work several days per week.

  • Work schedule: 8:00 AM – 5:00 PM (Manila Time).