Global Snowflake Developer | Data Engineer | Hybrid or Remote

Snowflake Developer | Data Engineer

Snowflake Medallion Architecture & CI/CD Pipelines


📍 Location: Argentina 🇦🇷 (Resistencia, Tandil), Mexico 🇲🇽 (Querétaro, Tepatitlán), Uruguay 🇺🇾 (Montevideo), Bolivia 🇧🇴 (Cochabamba), Spain 🇪🇸 (Valencia)

About the Role

👨‍💻 Are you a Snowflake expert who thrives on making sense of billions of events a month — and wants to build something that truly matters? 🧠✨

Join our data-driven team and help us shape the future of event tracking and consent intelligence. If you’re experienced, reliable, and passionate about turning complex pipelines into clean, actionable insights — we want to talk! 💬⚙️

🌍 We process several billion events per month, from the very first cookie banner load to the tiniest user interaction — that’s our superpower. We work on a medallion architecture over Snowflake to transform raw data into gold 🌟 — ultimately fueling tools like Planhat and driving critical business insights.
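
To picture the medallion flow described above, here is a minimal sketch of a bronze → silver → gold refresh, run through the snowflake-connector-python client. All database, schema, table, and column names (BRONZE.RAW_EVENTS, SILVER.EVENTS, GOLD.DAILY_CONSENT_SUMMARY, the warehouse, etc.) are illustrative assumptions, not our actual setup. 👇

```python
# Sketch: promoting raw cookie-banner events through medallion layers on Snowflake.
# Every name below (databases, tables, columns, warehouse) is an assumption for illustration.
import os
import snowflake.connector

MEDALLION_STEPS = [
    # Bronze -> Silver: type, parse, and deduplicate the raw events.
    """
    CREATE OR REPLACE TABLE SILVER.EVENTS AS
    SELECT
        event_id,
        TO_TIMESTAMP_NTZ(event_ts)  AS event_time,
        user_id,
        event_type,
        PARSE_JSON(payload)         AS payload_json
    FROM BRONZE.RAW_EVENTS
    WHERE event_id IS NOT NULL
    QUALIFY ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_ts DESC) = 1
    """,
    # Silver -> Gold: the business-ready aggregate that downstream tools consume.
    """
    CREATE OR REPLACE TABLE GOLD.DAILY_CONSENT_SUMMARY AS
    SELECT
        DATE_TRUNC('day', event_time)   AS event_day,
        event_type,
        COUNT(*)                        AS event_count,
        COUNT(DISTINCT user_id)         AS unique_users
    FROM SILVER.EVENTS
    GROUP BY 1, 2
    """,
]

def refresh_medallion() -> None:
    """Run each layer-promotion statement in order."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # assumed warehouse name
        database="ANALYTICS",       # assumed database name
    )
    cur = conn.cursor()
    try:
        for statement in MEDALLION_STEPS:
            cur.execute(statement)
    finally:
        cur.close()
        conn.close()

if __name__ == "__main__":
    refresh_medallion()
```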

But there’s a twist… we currently work without a staging environment 😱. Everything runs in production. That’s why we need thoughtful engineers who understand data pipeline risks, build CI/CD staging systems, and elevate data work into a science. 💡📊
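
To make the staging + CI/CD point concrete: one common Snowflake pattern is a zero-copy clone of the production database, so CI checks can run against realistic data without touching what's live. A minimal sketch, assuming hypothetical database names ANALYTICS / ANALYTICS_STAGING and a role with clone privileges:

```python
# Sketch: spin up a zero-copy clone of production as a staging database, so a CI job
# can validate pipeline changes before anything ships live. Names are assumptions.
import os
import snowflake.connector

def create_staging_clone(prod_db: str = "ANALYTICS",
                         staging_db: str = "ANALYTICS_STAGING") -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="SYSADMIN",  # assumed role; any role with CREATE DATABASE + clone rights works
    )
    cur = conn.cursor()
    try:
        # Zero-copy clone: metadata-only, so it stays cheap even at billions of rows.
        cur.execute(f"CREATE OR REPLACE DATABASE {staging_db} CLONE {prod_db}")
    finally:
        cur.close()
        conn.close()

if __name__ == "__main__":
    create_staging_clone()
```

A CI pipeline could call this step, run the changed transformations and data tests against the clone, and only promote to production on green. 💡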

💼 What You’ll Need (Required)

  • 3+ years working hands-on with Snowflake ❄️ in a real-world, high-volume setup
  • Experience building data pipelines in a medallion/lakehouse architecture 💠
  • Strong grasp of ETL/ELT workflows 🔁 and performance optimizations
  • Expertise in implementing or improving CI/CD flows ⚙️ for data engineering
  • Fluent in English 🏴󠁧󠁢󠁥󠁮󠁧󠁿, written and spoken — collaboration is key 🤝
  • A cool head in production-first environments 🚨 — yes, we ship live 😎
  • Ability to partner closely with Data Analysts 📊 to expose business-critical insights
  • Passion for delivering value, not just code 🫶

🔧 Bonus Points (Nice to Have)

  • Familiarity with tools like dbt 📐, Airflow 🪂, or Fivetran 🔌
  • Know-how of reverse ETL 🔄 and integrations with platforms like Planhat 🧠
  • DataOps experience (monitoring, versioning, testing pipelines) 🛠️

🎯 Sample Projects You Might Work On

  • 🚀 Own and evolve our medallion architecture on Snowflake ❄️
  • ⚒️ Build and refine critical data pipelines — batch & real-time
  • 🔄 Implement a staging environment with proper CI/CD flow
  • 🤝 Collaborate with analysts to make raw data business-ready
  • 🧪 Create safe, testable, and production-grade data processes (a minimal test sketch follows this list)
  • 🧭 Drive structure in a fast-moving team with data at its core
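
For a flavour of what "testable" could look like in practice, here is a minimal pytest-style data-quality check that a CI run might execute against the staging clone before promoting a change. The database and table names are the same illustrative assumptions used in the sketches above, not our real setup.

```python
# Sketch: pytest data-quality checks run against the (assumed) staging clone in CI.
# GOLD.DAILY_CONSENT_SUMMARY and ANALYTICS_STAGING are illustrative names only.
import os
import pytest
import snowflake.connector

@pytest.fixture(scope="module")
def cursor():
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database=os.environ.get("SNOWFLAKE_DATABASE", "ANALYTICS_STAGING"),
    )
    cur = conn.cursor()
    yield cur
    cur.close()
    conn.close()

def test_gold_summary_is_not_empty(cursor):
    cursor.execute("SELECT COUNT(*) FROM GOLD.DAILY_CONSENT_SUMMARY")
    assert cursor.fetchone()[0] > 0

def test_one_row_per_day_and_event_type(cursor):
    cursor.execute("""
        SELECT event_day, event_type, COUNT(*)
        FROM GOLD.DAILY_CONSENT_SUMMARY
        GROUP BY event_day, event_type
        HAVING COUNT(*) > 1
    """)
    assert cursor.fetchall() == []
```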

🗓️ Start Date

As soon as possible. We’re ready to move quickly! 🏃‍♂️💨

If you thrive on building pipelines that handle billions of events 📊, love clean architecture that turns raw data into gold 🪙, and want to shape the future of data-driven platforms at scale 🚀, this is your chance to power the backbone of modern consent tech with us! 🔐✨