Senior Data Engineer (GCP)

LOCATIONS: LATAM / FULLY REMOTE

Role Overview:

Are you a seasoned Data Engineer with comprehensive experience in Enterprise Data Warehouses, modern API development, and Google Cloud Platform (GCP) services? We are looking for an expert who not only has a deep understanding of platform-related automation but can also operationalize Data Warehouses, Data Lakes, automation workflows, and analytics platforms on GCP.

Key Technical Experience Required:

  • At least 5 years of direct involvement in Enterprise Data Warehouse technologies.
  • Minimum of 5 years of experience in software development for automation using languages such as Java, Python, and SQL.
  • At least 5 years of hands-on expertise with GCP Cloud data projects (Dataflow, Cloud Composer, BigQuery, Cloud Storage, etc.).
  • Minimum of 5 years designing and deploying large-scale distributed data processing systems using technologies such as MySQL, MemSQL, PostgreSQL, MongoDB, Hadoop, Spark, HBase, and Teradata.
  • Proven experience with modern API development.

Key Skills:

  • GCP Services: Proficiency in Storage & Databases such as Cloud Storage, Cloud SQL, Cloud Datastore, Cloud Source Repositories, and Cloud Spanner.
  • Big Data Tools: Expertise in BigQuery, Cloud Dataflow, Dataproc, and Cloud Datalab.
  • Languages & Tools: Strong skills in Python and familiarity with Pub/Sub.
  • Automations: Expertise in setting up Automations & Triggers.
  • Other Skills: Fluent in English, excellent communication, self-driven, and innovative.

Preferred Credentials:

  • Google Professional Data Engineer Certification.

Nice-to-Have Skills:

  • Business Intelligence proficiency, especially with tools like Looker or Data Studio.

Position Specifics:

  • Assessment Path: Data Engineering
  • Contract Type: MSA signed
  • Location: Open, but preference given to LATAM
  • Working Hours: 6-8 hours of overlap with US Eastern Standard Time (EST) business hours.
  • Essential Tech Stack: Must be proficient with GCP, Data Storage, Cloud Compute, Python, Apache Kafka, BigQuery, and Looker.

We are keen to onboard a professional who isn't just skilled but also shares our dedication to delivering exceptional solutions. If you're someone who thrives in a fast-paced, challenging, and collaborative environment, we'd love to have you on our team.