Job Description:

Job Summary

This role focuses on designing, developing, and maintaining cloud-based software systems that support the processing, storage, retrieval, and archiving of large-scale scientific datasets. The position supports mission-critical data processing pipelines and plays a key role in ensuring the reliability, performance, and accuracy of scientific data products used by researchers worldwide.

The role is part of a data management team responsible for end-to-end data workflows across multiple space science missions. The engineer will contribute to both new development and ongoing operational support, including troubleshooting, performance optimization, and production issue resolution.

Key Responsibilities

  • Design and develop software systems for large-scale data processing pipelines

  • Develop and maintain interfaces between databases and data processing systems

  • Improve processing algorithms and optimize performance for scientific workloads

  • Participate in day-to-day operational support, including troubleshooting, debugging, and resolving production issues

  • Contribute to maintaining stable, scalable, and reliable data processing environments

Experience, Skills & Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field

  • 8+ years of hands-on experience in software engineering and data management

  • Strong expertise in Python programming

  • Experience working with workflow management systems such as Airflow

  • Strong experience with relational databases

  • Proficiency with automation tools and CI/CD pipelines

  • Experience working with cloud platforms and services, including storage, compute, container, messaging, and database services

  • Strong problem-solving skills and ability to work effectively in a collaborative team environment

Work Location:

Baltimore, Maryland, United States

Company:

Space Telescope Science Institute