We are a small data-driven company looking for a senior or mid-level ETL developer. We offer an organized remote work environment with colleagues across the world. We don’t ask you to know everything, but we ask that you always stay ready to learn. In exchange we promise to provide you with a dynamic work environment that promotes sharing, learning, and a fondness for the power of data. Learn more below! We can’t wait to hear from you.
ABOUT THE JOB
- Job Type: Full time and permanent
- Job Location: Remote, with the option to join co-workers at our offices in Toronto or Montreal.
- Timezone: Americas
- Experience: Senior or mid-level (5+ years of relevant experience)
- Compensation: Based on experience
- Industry: Consulting / data services
- Company Size: 5 to 15 employees
WHO ARE WE?
At our client, we believe that access to data is key to understanding today’s challenges.
That’s why we have been designing, developing and monitoring hundreds of data pipelines with a focus on data engineering and operational excellence since 2014. Our goal? Make sure data is within everyone’s reach. Our clients rely on our data feeds to gather new insights and build products or services. We help our customers transform the insurance, retail, compliance, and manufacturing industries. We provide them with data strategy, system architecture, implementation, and outsourcing services.
And to achieve our mission, we:
- Always put our customers’ needs at the center of our operations,
- Keep data quality in mind at all times, and
- Invest early in DevOps and DataOps best practices to automate our processes
WHO ARE YOU?
You are an ETL developer or data engineer with significant experience in scripting languages and in developing and maintaining data pipelines. You have previously worked with large, messy datasets of at least a few million records.
You believe coding is only part of your job, and you pay attention to code quality, documentation, unit testing, and data quality. You are comfortable speaking directly with clients to present your approach and collect feedback.
You know when to ask for help, and are not afraid to learn new technologies.
And like us, you believe data is the future.
WHAT WILL YOU DO?
- Work with the product owners and our customers to develop a technical vision for the project, including ETL specifications, workflows and data model definition.
- Design and build ETL flows, using Python, Bash or SQL
- Work with the DevOps team to deploy your creations to production
- Document operating procedures to execute, monitor and maintain data pipelines
- Monitor and test integrations for multiple customers.
- Support on-going projects
- Carry out research & development on new data acquisition and transformation technologies
Remember: You must be comfortable working directly with the customer in a consulting model.
SKILLS AND REQUIREMENTS
- Bash, Batch
- Comfortable with the command-line
- Knowledge of common data formats (CSV, JSON, YAML, XML)
- Excellent understanding of the SQL language
- Understanding of API design and usage
- Good knowledge of web development (though we will not ask you to develop web apps)
- Familiar with development concepts and best practices (Kanban, DRY, Design patterns)
- Good written and spoken English
- Willing to learn new tools and technologies.
NICE TO HAVE
- You have experience working with an all-remote or distributed team.
- You know other languages such as Java, Node, or PHP.
- You’re excited by ETL frameworks like Talend, Pentaho, or Alteryx.
- You have previous experience with web scraping.
- You know Linux or AWS.
- You have experience with other database engines.
- You have experience with Elasticsearch and Kibana.
WHAT WE OFFER
- Direct access to our senior developers and founder. We all work together, and we want our people to thrive.
- An efficient and well-organized remote environment, with REAL processes and experience. (We were “in” before remote work was even “in.”)
- Flexible schedule.
- Work-life balance.
- Documented projects and specifications.