Software Engineer - Snowflake
About the job
We are looking for a talented and motivated Software Engineer with experience in Confluent, Snowflake, and Qlik to join our team. This role involves designing, developing, and maintaining real-time data streaming and cloud-based data warehousing solutions using Confluent Kafka and Snowflake. The ideal candidate will work closely with cross-functional teams to build scalable, efficient, and secure data pipelines that drive business intelligence and analytics.
Responsibilities:
- Data Pipeline Development: Design, develop, and optimize real-time and batch data pipelines using Confluent Kafka and Snowflake (a pipeline sketch follows this list).
- Streaming Data Processing: Implement event-driven architectures leveraging Kafka Streams, ksqlDB, and Kafka Connect.
- Data Integration: Integrate structured and semi-structured data from multiple sources into Snowflake for analytics and reporting.
- Database Management: Develop, optimize, and maintain Snowflake-based data warehouses, ensuring high availability and performance.
- Security & Compliance: Implement data security best practices, including encryption, role-based access control (RBAC), and compliance with GDPR, HIPAA, or other relevant regulations.
- Optimization: Optimize query performance and storage efficiency in Snowflake using clustering keys, micro-partition pruning, and result caching (see the tuning sketch after the Qualifications list).
- Testing: Conduct unit testing and data validation to ensure accuracy and reliability of data pipelines.
- Troubleshooting: Diagnose and resolve issues related to Kafka messaging, Snowflake storage, and data transformations.
- Documentation: Prepare and maintain technical documentation, including architecture diagrams, data flow processes, and best practices.
- Collaboration: Work with Data Scientists, BI Teams, and Cloud Engineers to provide real-time and batch data solutions.
- Maintenance: Monitor and maintain the stability of streaming pipelines and data warehouse operations.
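For illustration, here is a minimal sketch of the kind of pipeline the responsibilities above describe: consuming events from a Confluent Kafka topic and micro-batching them into a Snowflake table with the confluent-kafka and snowflake-connector-python libraries. The broker, topic, table, credentials, and event shape are all hypothetical placeholders, not details of our actual stack.

```python
# Minimal illustrative sketch: consume events from a Kafka topic and micro-batch
# them into a Snowflake table. Broker, topic, table, credentials, and the
# event shape (order_id, amount) are all hypothetical placeholders.
import json

from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    "bootstrap.servers": "broker:9092",     # placeholder broker
    "group.id": "snowflake-loader",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,            # commit offsets only after a successful load
})
consumer.subscribe(["orders"])              # hypothetical topic

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",  # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)

batch = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():      # real code would log and handle errors
            continue
        event = json.loads(msg.value())
        batch.append((event["order_id"], event["amount"]))
        if len(batch) >= 500:               # flush in micro-batches
            conn.cursor().executemany(
                "INSERT INTO raw_orders (order_id, amount) VALUES (%s, %s)", batch
            )
            consumer.commit()               # offsets advance only after the write lands
            batch.clear()
finally:
    consumer.close()
    conn.close()
```

In production this path would more likely run through the Kafka Connect Snowflake sink connector or Snowpipe Streaming; the hand-rolled consumer above just makes the moving parts visible.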
Qualifications:
- Strong understanding of real-time data streaming and event-driven architectures.
- Hands-on experience with Confluent Kafka (Kafka Streams, Kafka Connect, ksqlDB).
- Experience in Snowflake data warehouse development and administration.
- Proficiency in SQL, Python, or Scala for data processing and transformation.
- Familiarity with cloud platforms (AWS, Azure, GCP) for data pipeline orchestration.
- Experience with ETL/ELT processes and data pipeline automation.
- Knowledge of CI/CD for data pipelines, version control (Git), and DevOps tools.
- Strong problem-solving and analytical skills.
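On the warehouse side, the optimization responsibility above largely comes down to choosing clustering keys and confirming that micro-partition pruning stays healthy. A hedged sketch of that workflow, with hypothetical table, column, and connection names:

```python
# Hedged sketch: set a clustering key and check clustering health in Snowflake.
# Table, columns, and connection parameters are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",  # placeholders
    warehouse="ADMIN_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Cluster the table on the columns most queries filter on, so micro-partition
# pruning can skip irrelevant partitions at query time.
cur.execute("ALTER TABLE raw_orders CLUSTER BY (order_date, region)")

# SYSTEM$CLUSTERING_INFORMATION returns JSON with depth/overlap statistics
# for the table's clustering key.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('raw_orders')")
print(cur.fetchone()[0])

conn.close()
```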
Nice to have:
- Confluent Kafka Certification or Snowflake Certification.
- Experience with Apache Airflow, dbt, or other data orchestration tools (a minimal DAG sketch follows this list).
- Knowledge of big data technologies (Spark, Flink, or Druid).
- Understanding of GraphQL, REST APIs, and WebSockets for data integration.
- Experience working in finance, healthcare, or other regulated industries.
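For the orchestration tools listed above, a daily load-and-tune cycle expressed as an Apache Airflow DAG might look like the following sketch; the DAG id, schedule, and task bodies are illustrative assumptions only.

```python
# Illustrative Airflow DAG: run the Snowflake load, then the clustering check,
# on a daily schedule. DAG id, schedule, and task bodies are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_snowflake():
    ...  # e.g. validate that the streaming load landed, or run a COPY INTO

def clustering_check():
    ...  # e.g. query SYSTEM$CLUSTERING_INFORMATION and alert on degradation

with DAG(
    dag_id="kafka_snowflake_maintenance",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",       # `schedule` on Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    check = PythonOperator(task_id="clustering_check", python_callable=clustering_check)
    load >> check            # run the health check only after the load completes
```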