Senior Data Engineer (Azure/Databricks/PySpark) - Hybrid Porto (2 days/week office)

ABOUT THE OPPORTUNITY

We are looking for a Senior Data Engineer to join a leading, data-driven organization operating in a modern cloud environment. This opportunity is ideal for professionals passionate about building scalable data platforms and working with cutting-edge technologies within the Microsoft Azure ecosystem.

You will play a key role in designing and delivering robust data solutions that support critical business decisions. This is a highly technical position requiring strong ownership, a hands-on mindset, and experience working in production-grade environments.

PROJECT & CONTEXT

You will be part of a growing data platform team focused on developing and optimizing a modern data lakehouse architecture. The environment leverages Azure Data Factory v2, Azure Synapse Analytics, and Azure Databricks (runtime 13.x+), enabling large-scale data processing and advanced analytics.

The project involves working with complex data pipelines, integrating multiple data sources, and ensuring high performance and reliability across the platform. There is also a strong emphasis on DevOps practices, including CI/CD pipelines using Azure DevOps, as well as ongoing adoption of Microsoft Fabric capabilities.

WHAT WE'RE LOOKING FOR (Required)

  • Proven experience as a Senior Data Engineer in production environments
  • Strong expertise in ETL/ELT pipeline development
  • Hands-on experience with Azure Data Factory v2, Azure Synapse Analytics, and Azure Databricks (Spark 3.x)
  • Solid experience processing large datasets using Python (3.x) and PySpark
  • Advanced knowledge of SQL (T-SQL or ANSI SQL)
  • Strong understanding of data modeling techniques (relational and dimensional)
  • Experience implementing CI/CD pipelines using Azure DevOps
  • Familiarity with data lake / lakehouse architectures
  • Ability to gather business requirements and translate them into scalable technical solutions
  • Experience working across the full data lifecycle (design, development, deployment, maintenance)
  • Fluent in English (mandatory)

NICE TO HAVE (Preferred)

  • Experience with Microsoft Fabric
  • Knowledge of Dynamics 365 Finance & Operations (D365 F&O) data integration
  • Familiarity with Azure Synapse Link
  • Experience with Power BI for data validation and visualization
  • Exposure to infrastructure-as-code (IaC) or advanced DevOps practices
  • Experience in highly regulated or data-intensive industries