Job Opening: Data Engineer

About the Job

To turn data into real value for our organization, we are looking for specialized services that can help us develop data-driven products in data lake and cloud environments. We need expertise to implement complete data processing pipelines, from ingestion and transformation to API exposure and data visualization.

Responsibilities:

  • Design and implementation of complete data processing pipelines, both on-premises and in the cloud.
  • Development of data products for various functional domains.
  • Creation of data ingestion pipelines and integration of basic ML algorithms.
  • Building data products using SQL and NoSQL technologies.
  • Creation of interactive visualizations with tools such as Looker, Spotfire, or Tableau.
  • Consulting on technology and service selection and building a solid data tools ecosystem.

Essential Requirements:

  • Minimum 3 years of experience as a Data Engineer.
  • Proven experience with data lake or data warehouse environments (data modeling, data quality, flow monitoring).
  • Minimum 2 years of experience in Python programming.
  • Good knowledge of the following tools and technologies:
    • Google Cloud Platform, GitLab, Airflow, Dataproc, IntelliJ, Spark/Scala, or equivalent.
  • Exposure to AI/ML tools in development activities (e.g., Vertex AI, GitHub Copilot).
  • Fluent English and experience in international working environments.

Desirable:

  • Experience with AI projects, Dataiku, or web applications is a strong plus.
  • Familiarity with Jira and Confluence.