Senior Data Platform Engineer (Python/Azure/Kubernetes) - Remote Portugal

ABOUT THE OPPORTUNITY

Join a world-class technology consultancy as a Senior Data Platform Engineer, bridging data and software engineering to build scalable, reliable infrastructure. You'll design and maintain cloud-native data platforms using Python, Terraform, and Kubernetes while creating smooth data processing and deployment experiences. The role offers the opportunity to apply a software engineer's mindset to complex infrastructure and data modeling challenges, in a collaborative environment where you'll translate technical concepts for leadership and partners.

PROJECT & CONTEXT

You'll build software services and APIs using Python frameworks (ideally FastAPI) to create reliable data processing and deployment experiences. The role involves designing and maintaining secure cloud infrastructure from a software engineering perspective, handling core networking components including WAF and CDN on Azure. You'll use Terraform to manage infrastructure as code and deploy scalable services with AKS (Azure Kubernetes Service) and Docker. Implementing end-to-end data solutions, you'll ensure data flows correctly across both real-time streaming architectures (Kafka/Kinesis) and batch processing systems. Working with data platforms such as Databricks, Snowflake, and Airflow, you'll apply data modeling methodologies including Kimball, Inmon, and Data Vault. Throughout, you'll maintain a strong focus on CI/CD workflows, engineering principles, and architectural best practices to deliver high-impact solutions in a fast-paced environment.
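
To give a flavor of the service-building work described above, here is a minimal sketch of a Python/FastAPI service. It is illustrative only: the service name, routes, and in-memory state are hypothetical, and a real service would persist state durably and sit behind the Azure networking layer mentioned above.

    # Minimal FastAPI service sketch; service name, routes, and state are hypothetical.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="dataset-deployment-api")

    class DeployRequest(BaseModel):
        dataset: str       # e.g. a Databricks or Snowflake table name
        environment: str   # e.g. "staging" or "prod"

    # In-memory stand-in; a real service would use a durable store or queue.
    _deployments: dict[str, str] = {}

    @app.post("/deployments")
    def create_deployment(req: DeployRequest) -> dict:
        """Queue a dataset deployment to the requested environment."""
        if req.environment not in {"staging", "prod"}:
            raise HTTPException(status_code=422, detail="unknown environment")
        _deployments[req.dataset] = req.environment
        return {"dataset": req.dataset, "environment": req.environment, "status": "queued"}

    @app.get("/deployments/{dataset}")
    def read_deployment(dataset: str) -> dict:
        """Report where a dataset is deployed."""
        if dataset not in _deployments:
            raise HTTPException(status_code=404, detail="dataset not deployed")
        return {"dataset": dataset, "environment": _deployments[dataset]}

A service like this would typically be containerized with Docker and rolled out to AKS via the CI/CD workflows described above.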

WHAT WE'RE LOOKING FOR (Required)

  • 5+ years in data roles with a proven track record of end-to-end delivery in fast-paced environments
  • Strong Python development: Deep expertise in structuring Python projects and building web services with frameworks such as FastAPI
  • Azure cloud mastery: Deep experience with Azure cloud infrastructure, including networking, WAF, and CDN configuration
  • Multi-cloud exposure: Solid familiarity with AWS and GCP platforms
  • Infrastructure as Code: Proficient in Terraform for managing cloud infrastructure
  • Container orchestration: Strong hands-on experience with AKS (Azure Kubernetes Service), Kubernetes, and Docker
  • Data modeling expertise: Strong understanding of Kimball dimensional modeling, Inmon enterprise data warehouse, and Data Vault methodologies
  • Data platform experience: Hands-on with Databricks, Snowflake, and Apache Airflow for data processing and orchestration
  • Streaming technologies: Experience with real-time data streaming using Apache Kafka or AWS Kinesis (a minimal consumer sketch follows this list)
  • Software engineering mindset: Solid understanding of engineering and architectural principles applied to infrastructure and data challenges
  • CI/CD collaboration: Experience implementing and maintaining CI/CD workflows and pipelines
  • Communication excellence: Ability to translate technical concepts for leadership, partners, and diverse audiences
  • Language requirement: Fluent English (mandatory)
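
As an illustration of the streaming requirement flagged above, here is a minimal sketch of a Kafka consumer using the confluent-kafka Python client. The broker address, topic, and consumer group are placeholders, and error handling is reduced to the essentials.

    # Minimal Kafka consumer sketch (confluent-kafka client); broker, topic, and
    # group id are placeholders.
    import json
    from confluent_kafka import Consumer, KafkaError

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "data-platform-demo",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["events"])  # hypothetical topic name

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue  # no message within the poll window
            if msg.error():
                # Partition EOF is informational; surface anything else.
                if msg.error().code() != KafkaError._PARTITION_EOF:
                    raise RuntimeError(msg.error())
                continue
            event = json.loads(msg.value())
            # Downstream, events like this would land in batch storage
            # (e.g. Databricks or Snowflake) for modeling.
            print(f"offset={msg.offset()} event={event}")
    finally:
        consumer.close()

An equivalent Kinesis consumer would poll shards via boto3's get_records; the shape of the work is the same either way: poll, validate, and hand events off to batch storage.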

NICE TO HAVE (Preferred)

  • Experience with additional Python frameworks (Django, Flask)
  • Knowledge of advanced Kubernetes patterns and service mesh technologies
  • Familiarity with data quality and observability tools
  • Background in DataOps practices and automation
  • Experience with API gateway and service mesh implementations
  • Contributions to open-source data engineering projects