Senior Data Engineer (Azure/Databricks/Python) - Remote, Portugal

ABOUT THE OPPORTUNITY

Join a world-class technology consultancy as a Senior Data Engineer, playing a key role in designing, building, and maintaining the data infrastructure that drives data-driven decision-making. You'll collaborate with cross-functional teams to ensure the availability, reliability, and accessibility of data assets, enabling the organization to extract actionable insights and deliver high-impact solutions. In this role you'll build robust data pipelines, create data products, and work closely with business partners and developers in a fast-paced environment.

PROJECT & CONTEXT

You'll architect and maintain data pipelines supporting Business Intelligence, Data Science, and analytics initiatives for Business and Studio deliverables. The role involves designing data models and building ETL/ELT workflows in Azure Data Factory and Databricks that transform raw data into actionable insights. You'll build APIs and data products to integrate data across organizational systems, monitor pipeline health, and ensure data quality. Working closely with business partners, developers, and leadership, you'll translate requirements into technical solutions while coordinating cross-functional initiatives. You'll leverage modern data modeling methodologies, including Kimball dimensional modeling, Inmon enterprise data warehousing, and Data Vault 2.0, to create scalable, maintainable architectures. National and international travel varies by project (estimated 0-15%).

WHAT WE'RE LOOKING FOR (Required)

  • Proven Data Engineering experience in fast-paced production environments
  • Expert-level proficiency with SQL and SQL-like query languages for complex data manipulation
  • Deep Python expertise with experience organizing and structuring Python-based data projects
  • Azure Data Factory and Databricks: Extensive hands-on experience building data solutions and pipelines
  • ETL/ELT expertise: Experience designing and implementing data transformation processes
  • Cloud platform knowledge: Experience with Microsoft Azure or AWS and their data services ecosystem
  • Data modeling expertise: Strong understanding of Kimball, Inmon, and Data Vault methodologies
  • API development: Ability to build data products and APIs for system integration
  • Collaboration skills: Strong partnership abilities, working effectively with business stakeholders, developers, and leadership
  • Communication excellence: Ability to tailor communication for different audiences and document requirements clearly
  • Language requirement: Fluent English (mandatory)

NICE TO HAVE (Preferred)

  • Automated testing frameworks for data projects (unit and integration testing)
  • Business Intelligence and Business Analyst experience
  • Infrastructure as Code with Terraform or CloudFormation
  • Data visualization tools proficiency
  • Snowflake data warehouse platform experience
  • CI/CD automation using GitLab
  • Experience integrating data platforms with observability and monitoring tools
  • Strong partnership and stakeholder management skills across technical and business teams