Data Engineer, Snowflake and Databricks - Austin, TX - Hybrid or Remote in the U.S.

About the job

FinTrust Connect | Austin, TX | Hybrid or Remote

Share Your Resume and Build Your Future!

Join our Talent Community for Austin. Roles in this market emphasize Databricks-first development, with Snowflake serving analytics and data sharing, plus strong pipeline ownership and stakeholder skills. Contract rates trend mid-market with steady demand.

As a Data Engineer, you will ship production-grade pipelines on Databricks and Snowflake, stand up data quality checks and monitoring, and partner with product and risk teams to deliver trusted data sets.

Requirements

  • 4 to 7 years building ELT pipelines with Python and SQL

  • Databricks: PySpark, Delta tables, jobs, and notebooks

  • Snowflake: performance tuning, tasks, streams, and Time Travel

  • Orchestration: Airflow, dbt, or Azure Data Factory

  • Familiarity with CI/CD and infrastructure as code (IaC)

Responsibilities

  • Build scalable ingestion and transformation with tests and documentation

  • Model curated data layers that serve BI, ML, and finance

  • Implement monitoring for freshness, volume, and schema drift

  • Optimize compute and storage for cost and runtime

Outcomes we track

  • Freshness SLA met 95%+ across critical tables

  • Runtime reduced 30% on top pipelines within 60 days

  • Incident mean time to recover < 30 minutes

Compensation and terms

  • Consultant pay: $85 to $150 per hour

  • Contract; Hybrid (Austin, TX) or Remote (U.S.); W2 or 1099

How to apply

Share your resume to join the FinTrust Connect Talent Community for Austin.

Keywords
Databricks, Snowflake, PySpark, Delta, Unity Catalog, dbt, Airflow, ADF, Python, SQL, Cost Governance, Observability, ELT, CDC, Austin