Data-Oriented Software Engineer

About the job
- Design, develop, test, deliver and support cloud-based workflow automation tools that serve the internal data-ingestion needs of external-facing customer applications. Opportunities exist for front-end, middle-tier and back-end development.
- Build and maintain the infrastructure required to extract, transform and load data from a variety of sources. Create optimal data schemas, pipelines and architectures in a hybrid cloud environment.
- Assemble large and complex data sets, and secure the data throughout the entire pipeline.
- Develop unit tests, integration tests and build scripts to support the continuous integration and delivery of bug-free releases.
- Create intuitive front-end user interfaces.
- Collaborate with internal stakeholders on their data needs and data-related technical issues.
- Collaborate with team engineers and serve as a technical resource for them.
- Exercise creativity in continuously pursuing improvement of the product development lifecycle. Identify, design and implement internal process improvements such as automating manual data processes, optimizing data delivery and ensuring scalability.
- Create analytics tools utilizing the data pipeline to provide data science team members with actionable insights to optimize operational efficiencies.
- Derive satisfaction and enjoyment from each day's work.

Qualifications
- Strong technical background with a mix of data, programming and workflow automation skills.
- Experience building and optimizing data pipeline architectures, data ingestion and workflow management tools.
- Working knowledge and experience in current .NET development technologies (C#, ASP.NET, .NET Core). Demonstrable knowledge and application of coding standards.
- Working knowledge and experience in Microsoft SQL Server and T-SQL (or other comparable relational database and querying language). Bonus points for experience with SSIS.
- Working knowledge and experience in web service programming and microservices.
- Working knowledge and experience in public cloud technologies (Azure, AWS, GCP); Azure preferred. Experience with Azure Data Factory is a big plus.
- Working knowledge of source control platforms (GitLab, GitHub and/or Azure DevOps Server).
- Ability to collaborate successfully with other internal teams in a dynamic environment.
- Strong creativity and problem-solving skills. Perseverance in solving complex problems.
- Interest in learning and growing technically, and a willingness to share knowledge with other engineers to help them learn and grow in their field.
- Excellent written and oral communication skills; fluency in English is required.
- Highly organized and self-disciplined to operate successfully in a fully remote environment.
- Working knowledge of message queuing, stream processing and highly scalable big data stores.
- Working knowledge and experience in Python.