Data Engineer (Qlik/Snowflake) - Remote Lisbon (Only 3 Office Visits Total)

ABOUT THE OPPORTUNITY

Join a leading European logistics technology company powering millions of daily operations across 41 countries, all while working in a truly flexible remote environment. This role offers the chance to work across multiple data engineering projects, designing and building optimized data pipelines, integrations, and analytical structures that drive business insights and operational excellence.

The position is effectively fully remote: you'll only need to visit the Venda do Pinheiro office three times in total, on day one for equipment pickup, on your final day for equipment return, and once for a team alignment meeting if necessary. This exceptional flexibility lets you work from anywhere in Portugal while contributing to mission-critical data infrastructure that supports logistics operations at scale. You'll work with modern cloud data platforms including Snowflake, relational databases such as PostgreSQL, and business intelligence tools such as Qlik Sense and QlikView, collaborating with data analysts, software developers, and business stakeholders in an Agile environment focused on delivering reliable, scalable data solutions.

PROJECT & CONTEXT

You'll design, implement, and maintain scalable data pipelines and ETL/ELT processes across multiple systems within a dynamic multi-project environment. Your work will focus on developing and optimizing database structures that ensure high data quality, performance, and availability, while working closely with business stakeholders and analysts to translate data needs into technical solutions.

Responsibilities include supporting and evolving data warehouse and reporting environments, particularly on the Snowflake and Qlik platforms, and ensuring data consistency, accuracy, and integrity across systems and projects. You'll contribute to improving existing data flows, storage strategies, and analytical frameworks, drawing on your expertise in SQL performance tuning, Snowflake data modeling and ELT patterns, and Qlik dashboard development. The role requires strong data analysis skills to turn raw data into actionable insights, excellent debugging and troubleshooting capabilities, and a customer/user-centric approach to delivering solutions that directly impact business success, all while maintaining data governance standards and metadata management practices.

WHAT WE'RE LOOKING FOR (Required)

  • PostgreSQL Expertise: Strong hands-on experience with PostgreSQL including advanced SQL, performance tuning, and schema design
  • Snowflake Proficiency: Solid experience with Snowflake cloud data platform including data modeling, ELT patterns, and performance optimization
  • Qlik Experience: Practical background with Qlik Sense or QlikView for data modeling, dashboard creation, and business reporting
  • Data Engineering Concepts: Solid understanding of ETL/ELT pipelines, data lakes, and data warehouse architectures
  • Database Development: Strong skills in database structure design, optimization, and maintenance
  • Data Integration: Experience designing and implementing scalable data pipelines and system integrations
  • SQL Mastery: Advanced SQL capabilities for complex queries, optimization, and data transformation
  • Data Quality Focus: Understanding of data governance, data quality frameworks, and metadata management
  • Data Analysis: Strong analytical skills translating data into actionable business insights
  • Linux Proficiency: Comfortable working in Linux-based environments for data engineering tasks
  • Problem-Solving: Excellent debugging, troubleshooting, and optimization techniques
  • Time Management: Good organizational skills managing multiple priorities across projects
  • Collaboration: Team player with strong communication skills working with cross-functional teams
  • Ownership Mindset: Sense of pride and responsibility for work quality and business impact
  • User-Centric: Customer/user-focused approach to solution development
  • Language: B2 English (Upper Intermediate) minimum for stakeholder communication
  • Seniority: Senior-level experience with data engineering in production environments

NICE TO HAVE (Preferred)

  • Scripting Languages: Experience with Python or Java for data processing and automation
  • Cloud Platforms: Exposure to AWS, Azure, or GCP data services beyond Snowflake
  • Additional BI Tools: Familiarity with Power BI, Tableau, or other visualization platforms
  • Data Orchestration: Experience with Airflow, Azure Data Factory, or similar workflow tools
  • API Development: Building data APIs or microservices for data access
  • Version Control: Proficiency with Git and collaborative development practices
  • CI/CD Pipelines: Experience with automated deployment for data solutions
  • Data Modeling: Dimensional modeling and star schema design experience
  • NoSQL Databases: Exposure to MongoDB, Cassandra, or other NoSQL technologies
  • Containerization: Docker or Kubernetes knowledge for data applications
  • Agile Methodologies: Experience working in Scrum or Kanban environments
  • Logistics/Supply Chain: Background in logistics, supply chain, or operations technology domains
  • Data Security: Understanding of data encryption, masking, and compliance requirements
  • Performance Monitoring: Experience with data pipeline monitoring and alerting systems

Location: Remote, Portugal (based in the Lisbon/Venda do Pinheiro area; only 3 office visits total)