Job Openings M10 - Data Engineer

Overview

We are seeking a Data Engineer to support the design, development, and deployment of cloud-native applications and data platforms. The role involves building scalable data pipelines, developing backend services, and ensuring secure and reliable data platform operations.

The candidate will work closely with Product Managers, Business Analysts, Solution Architects, and QA Engineers to deliver secure, scalable, and high-performing solutions across data, application, and infrastructure layers.

Key Responsibilities

  • Design and develop scalable data pipelines (batch and near real-time) including data ingestion, transformation, and orchestration.
  • Develop and maintain backend services and APIs for secure data integration and access.
  • Implement and support enterprise data platforms such as data lakes and operational data stores.
  • Deploy and manage workloads using AWS services (e.g. ECS/Fargate, Lambda, S3, RDS, API Gateway, IAM, CloudWatch).
  • Build and maintain CI/CD pipelines with automated testing and security scanning.
  • Automate infrastructure using Infrastructure-as-Code (Terraform or CloudFormation).
  • Implement security controls including authentication, authorization, encryption, and secrets management.
  • Implement monitoring, logging, and observability across applications and cloud infrastructure.
  • Troubleshoot technical issues across data pipelines, APIs, and cloud environments.
  • Participate in Agile development processes and conduct code reviews to ensure quality and security best practices.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
  • Minimum 5 years of experience in data engineering, software engineering, or DevSecOps roles.
  • Hands-on experience with AWS cloud services.
  • Experience with CI/CD and code-quality tools such as GitLab CI/CD and SonarQube, and with collaboration tools such as Jira and Confluence.
  • Experience with Infrastructure-as-Code tools (Terraform or CloudFormation).
  • Strong programming skills in Python, Node.js, JavaScript, TypeScript, or C#.
  • Experience developing APIs and microservices architectures.
  • Understanding of data pipelines, data transformation, and enterprise data platforms.
  • Strong problem-solving skills and ability to work in Agile environments.

Preferred

  • Experience with Government Commercial Cloud (GCC) or regulated environments.
  • Experience with container technologies (Docker, Kubernetes).
  • Exposure to AI / Large Language Models (e.g. OpenAI, Amazon Q, Gemini).