Job Openings Data Engineer (GCP/Python/AWS) - Hybrid Portugal (Porto or Lisbon)

ABOUT THE OPPORTUNITY

A well-established organisation with a strong data culture is looking for a talented Data Engineer to join its team on a consulting basis. You'll work on data infrastructure that matters — pipelines, platforms, and processing systems that underpin real business decisions. Based in Portugal (Porto or Lisbon), the role follows a hybrid work model, keeping you connected to a collaborative, technically ambitious team while offering flexibility in how you work.

PROJECT & CONTEXT

You'll be embedded in a data engineering environment where the stack spans multiple cloud platforms — primarily Google Cloud Platform (GCP), with touchpoints across AWS — and where the challenges are genuinely complex. Think large-scale data pipelines, distributed processing, real-time streaming, and integration across heterogeneous data sources. This isn't a maintenance role — it's an opportunity to build and evolve data infrastructure that scales.

WHAT WE'RE LOOKING FOR

  • Hands-on experience with Google Cloud Platform (GCP) for data engineering workloads — pipelines, storage, and processing services
  • Strong proficiency in Python as the primary programming language; Java or Scala is a plus
  • Practical experience with Big Data technologies such as Apache Spark, Apache Kafka, Apache Flink, Elasticsearch, Hadoop, Hive, or similar distributed processing frameworks
  • Solid knowledge of data modeling and database design principles, applied to both analytical and operational contexts
  • Experience building data integration and ETL/ELT pipelines, using tools such as Apache Kafka or Talend
  • Understanding of distributed systems architecture and data processing patterns at scale
  • Strong SQL skills with experience across both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Elasticsearch)
  • Familiarity with additional cloud platforms including AWS (e.g., S3, Glue, Redshift) or Azure (e.g., Azure Data Factory)
  • Experience using Git for version control in a collaborative development environment
  • English at B2 level (Upper Intermediate) or above — required for daily communication with technical and business stakeholders
  • Based in Portugal (Porto or Lisbon area)

NICE TO HAVE

  • Experience with stream processing frameworks such as Kafka Streams, Kafka Connect, or Apache Flink
  • Familiarity with tools like Druid, Impala, Sqoop, or Flume in a production data lake context
  • Exposure to infrastructure-as-code or CI/CD pipelines for data workflows
  • Experience working with dbt or similar data transformation frameworks
  • Background in consulting or multi-client data engineering environments