South Jakarta, Jakarta, Indonesia

Head of Data Engineering

Job Description:

Duties and Responsibilities:

  • Drive the advancement of enterprise data infrastructure by designing and implementing the underlying logic and structure for how data is set up, cleansed, and ultimately stored for organizational use
  • Build robust systems and reusable code modules to solve problems across the team and organization, with an eye on the long-term maintenance and support of the application
  • Work with the latest open-source tools, libraries, platforms, and languages to build data products that enable other analysts to explore and interact with large and complex data sets
  • Partner with Product Owners and cross-functional teams in a collaborative, agile environment
  • Design rich data visualizations, interactive tools, and solutions to communicate complex ideas to customers and company leaders
  • Collaborate across the enterprise to enable and share best practices and reusable, scalable tools and code for our analyst community
  • Establish forward-looking data and technology objectives, and develop the strategies and actions for the team to achieve them
  • Mentor other engineers and develop technical knowledge and skills to keep the enterprise on the cutting edge of technology
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader
  • Work with data and analytics experts to strive for greater functionality in our data systems

Qualifications and Requirements:

  • Bachelor’s degree in Computer Science, Electrical Engineering, Software Engineering, Computer Information Systems, Engineering, Communications Technology, or a related field of study
  • Six (6) years of experience in the position offered or in a related consulting, analyst, or engineering role
  • This experience must include at least six (6) years of work with relational databases and data mining
  • Four (4) years of experience with a data engineering programming language (e.g., Python, R, Scala, SQL, and/or SAS)
  • Four (4) years of experience using data visualization tools (e.g., Tableau, d3.js, or similar)
  • Two (2) years of experience with big data
  • Two (2) years of experience with cloud computing and/or DevOps
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including PostgreSQL, MySQL, SQL Server, etc.
  • Experience with BI tools such as Tableau is a plus
  • Experience with ETL and at least one ETL tool (SSIS, Pentaho, AWS EMR, Spark, etc.)
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark Streaming, etc.
  • Fluent in English

Required Skills:

ETL, Best Practices, Workflow, Data Visualization, NoSQL, SAS, Scala, Information Systems, DevOps, Tableau, Software Engineering, Hadoop, Metrics, Mining, Big Data, Data Mining, Business Requirements, Engineer, Electrical Engineering, Analytics, Cloud Computing, PostgreSQL, Engineers, R, Infrastructure, Consulting, Architecture, Databases, Computer Science, Programming, Python, Software, MySQL, SQL, Science, Design, Engineering, Maintenance, Business, English, Management