
About the job Technical Lead Platform and DataLake Engineer

Job Location: 100% remote in Romania

Recruitment process: 

  • HR Screening 
  • Technical Interview

Role description:

We are looking for a talented Senior/Technical Lead Platform and DataLake Engineer who will not only write code but also deliver exceptional product capabilities for our clients. Your contributions will play a pivotal role in driving client success and solidifying our position as a leading product development services company.

Responsibilities:

  • Develop a cutting-edge application for our customers.
  • Design and implement scalable foundational services to support data pipeline processing, search functionality, user management, and other customer-facing features.
  • Work on ML infrastructure and Generative AI applications to advance scientific use cases.
  • Build and deliver high-quality products using Agile software development methodologies.
  • Engage in continuous learning, growth, and professional development.
  • Articulate your vision to peers and leadership while being open to constructive feedback and maintaining resilience.

Profile:

  • 8+ years of experience in the software development industry, preferably on data engineering, data warehousing, and data analytics teams.
  • Experience designing and implementing complex, scalable data pipelines/ETL services.
  • Expert-level JavaScript, TypeScript, and Python.
  • Extensive experience with cloud-based data storage and processing technologies, particularly AWS services such as S3, Step Functions, and Lambda, as well as Airflow.
  • Expert-level understanding of and hands-on experience with Lakehouse architecture.
  • Ability to articulate ideas clearly, present findings persuasively, and build rapport with clients and team members.

Nice to have:

  • Knowledge of basic DevOps and MLOps principles.
  • 3+ years of experience with the Databricks ecosystem.
  • Expert-level experience with Spark/Glue and Delta Lake/Iceberg tables.
  • Working knowledge of Snowflake.
  • Experience working with Data Scientists and ML Developers.
  • Hands-on experience with data warehousing solutions and ETL tools.