
About the job Senior DataOps

At BetterWay we are looking for an experienced Senior DataOps engineer.

Benefits

  • Remote in Colombia
  • Salary: USD 4,000-5,000
  • Contractor
  • Prepaid medicine
  • Life insurance
  • US company with daily English communication
  • We help connect great talent in Colombia with excellent technical challenges and salaries.


Engineering at Grove

The Grove product/engineering/design team is rooted in a data- and goal-driven culture. We love to take ownership of big business and technology challenges in our high-growth phase, and solve them quickly and robustly. We are divided into small, independently executing Agile teams to allow maximum autonomy and speed. We build in a modern, evolving stack (Python, Vue, Docker, Kubernetes, AWS), leveraging new technologies as needed. We use CI/CD to deploy several times a day so that we can impact the business as quickly as possible. We operate with shared trust and no egos, and enjoy being 'in this together' to collaborate on the challenges of a rapidly scaling business.

Your Role and Impact

  • Build and architect low-latency data infrastructure and systems to support all of the organization's data needs
  • Create and deliver algorithms, simulators, and forecasts that allow us to ship millions of customer orders a month
  • Build, maintain, and troubleshoot ETL pipelines for the data sources that power the company
  • Plan for the future and ensure our data platform remains scalable and fast
  • Build automated testing, performance evaluations, monitoring tools and dashboards
  • Work with and build APIs from/for other microservices and outside services
  • Please note this job description is not designed to cover or contain a comprehensive listing of the activities, duties, or responsibilities required of the employee in this job. Duties, responsibilities, and activities may change at any time with or without notice.

About You

  • 3+ years of data engineering in a production environment
  • 2+ years of experience coding in a data capacity (Python)
  • 2+ years of experience with SQL (we use Postgres and MySQL)
  • Strong experience with data modeling, query optimization, and tuning
  • Production experience with data engineering technologies such as Airflow, Airbyte, dbt, and Snowflake
  • Advanced proficiency in a programming language such as Python or SQL (Java/Scala optional, but great if you have it!), in data warehousing and processing systems such as Snowflake, Spark, and dbt, and in workflow orchestration platforms such as Airflow
  • Production experience with Kubernetes or another container orchestration platform is a plus
  • Production experience with Terraform is a plus
  • Experience implementing CI/CD/CT pipelines (GitHub Actions)
  • Ability to work independently
  • Good knowledge of Bash and Unix command-line toolkit.