About the job Mid Data Engineer (Databricks) - Hybrid Lisbon (2 days/week on-site)
ABOUT THE OPPORTUNITY
Join data transformation initiatives that power critical business intelligence and operational reporting across enterprise organizations. This role offers the chance to work with modern cloud-based data platforms, focusing on Databricks as the primary data engineering environment while building scalable data pipelines and analytical models. Operating in a flexible hybrid model with 2 days per week in the Lisbon office, you'll contribute to designing and implementing data solutions that drive strategic decision-making through financial and operational reporting. As a Mid-level Data Engineer, you'll work with cloud infrastructure across Azure, GCP, or AWS, developing both relational and analytical data models that serve diverse stakeholder needs. This position provides excellent exposure to enterprise-scale data architectures and DevOps practices, and the opportunity to grow your expertise in the rapidly evolving data engineering landscape while collaborating with talented professionals in a consulting environment.
PROJECT & CONTEXT
You'll be responsible for designing, building, and maintaining data pipelines and infrastructure using Databricks as the core platform. Your work will focus heavily on creating robust data models—both relational and analytical—that support financial reporting, operational dashboards, and business intelligence initiatives across the organization. Working within cloud-based data infrastructures (Azure, GCP, or AWS), you'll implement ETL/ELT processes that ensure data quality, accuracy, and availability for downstream consumers. The role requires translating business requirements into technical data solutions, optimizing data workflows for performance and scalability, and collaborating with data analysts, business stakeholders, and other engineers to deliver value through data. You'll participate in the full data lifecycle from ingestion and transformation to modeling and delivery, ensuring solutions meet both functional requirements and technical standards while maintaining documentation and following best practices for version control and deployment.
WHAT WE'RE LOOKING FOR (Required)
- Academic Background: Degree in Computer Engineering, Electrical Engineering, Information Systems Management, or related technical fields
- Data Engineering Experience: 3+ years proven experience working as a Data Engineer in production environments
- Databricks Expertise: Solid hands-on experience with Databricks platform for data engineering workloads—this is mandatory
- Data Modeling Proficiency: Strong skills in designing relational and analytical data models specifically for financial and operational reporting
- Cloud Infrastructure: Practical experience with cloud-based data platforms on Azure, GCP, or AWS
- ETL/ELT Development: Building and maintaining data pipelines, transformations, and workflows
- SQL Skills: Advanced SQL capabilities for data manipulation, analysis, and modeling
- Language: B2 English (Upper Intermediate) minimum for technical communication and documentation
- Work Model: Availability for hybrid schedule with 2 days per week in Lisbon office
- Location: Based in Portugal with ability to commute to Lisbon office regularly
NICE TO HAVE (Preferred)
- DevOps Experience: Background with DevOps practices, CI/CD pipelines, and infrastructure automation
- Python/PySpark: Programming skills for advanced data transformations and pipeline development
- Delta Lake: Experience with Delta Lake for reliable data lakes and ACID transactions
- Apache Spark: Deep understanding of Spark architecture and optimization techniques
- Data Warehousing: Knowledge of dimensional modeling, star schemas, and data warehouse design patterns
- Version Control: Proficiency with Git and collaborative development workflows
- Databricks Features: Experience with Unity Catalog, Delta Live Tables, or Databricks Workflows
- BI Tools: Familiarity with Power BI, Tableau, or other business intelligence platforms
- Agile Methodologies: Experience working in Scrum or Kanban environments
- Data Quality Frameworks: Implementation of data validation and quality checks
- Performance Tuning: Optimization experience for large-scale data processing workloads
- Certifications: Databricks Certified Data Engineer Associate or cloud platform certifications
Location: Lisbon, Portugal (Hybrid - 2 days/week on-site)