Job Opportunity: Data Engineer (Remote, Americas Time Zones)
Position Type: Full-Time Contract
Duration: 1 Year
Start Date: April 1, 2026
Location: 100% Remote (Americas time zones preferred, ideally close to US Central Time)
About the Role
We are looking for a Data Engineer to design, build, and maintain scalable data systems that power analytics and operational platforms. You'll collaborate with cross-functional teams to develop robust data pipelines and platforms that integrate data from internal systems, third-party providers, and real-time streams.
This is a great opportunity to join a fast-growing tech company and work with clients across Europe and North America.
What You'll Do
- Design, build, and maintain scalable data pipelines and processing systems
- Ingest and process data from APIs, internal systems, third-party platforms, and streaming sources
- Develop batch and near real-time data pipelines
- Build and maintain data models and data warehouse structures
- Write efficient, reusable code for large-scale data processing
- Collaborate with product managers and engineering teams to define data needs
- Participate in Agile ceremonies (standups, sprint planning, retrospectives)
- Troubleshoot and resolve production issues
- Improve code quality, system reliability, and development practices
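To give a flavor of the batch-pipeline work described above, here is a toy, standard-library-only Python sketch of one extract → transform → load run. All names, the record shape, and the CSV target are illustrative assumptions, not this team's actual systems; in production the load step would typically write to S3/Redshift under an orchestrator such as Airflow.

```python
import csv
import io
import json


def extract(raw_payload: str) -> list[dict]:
    """Extract: parse a JSON API-style payload into raw records."""
    return json.loads(raw_payload)["events"]


def transform(records: list[dict]) -> list[dict]:
    """Transform: flatten semi-structured records into warehouse-ready rows."""
    rows = []
    for rec in records:
        rows.append({
            "event_id": rec["id"],
            "event_type": rec["type"],
            # Nested fields may be absent; default to None / 0 rather than fail.
            "user_id": rec.get("user", {}).get("id"),
            "amount_usd": round(float(rec.get("amount", 0)), 2),
        })
    return rows


def load(rows: list[dict]) -> str:
    """Load: serialize rows as CSV (a stand-in for a warehouse COPY/INSERT)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["event_id", "event_type", "user_id", "amount_usd"]
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


def run_pipeline(raw_payload: str) -> str:
    """One batch run: extract -> transform -> load."""
    return load(transform(extract(raw_payload)))
```

In a real deployment each of the three functions would be its own orchestrated task so that failures can be retried per step rather than per run.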
Who You Are
- Analytical and detail-oriented
- Comfortable working with large-scale data systems
- Collaborative and team-focused
- Curious and eager to learn new technologies
- A strong problem-solver in production environments
- Passionate about building reliable and scalable data infrastructure
Required Experience
- 5+ years in Data Engineering or related roles
- Strong programming skills (Python, SQL, Java, or Scala)
- Experience building and maintaining data pipelines and orchestration systems
- Knowledge of data warehousing (relational & dimensional modeling)
- Experience with APIs, real-time data, and structured/semi-structured data
- Experience handling large-scale datasets
- Familiarity with batch and near real-time processing
- Experience working in Agile/Scrum environments
Tech Stack
Data Processing: Python, SQL, PySpark
Pipelines & Orchestration: Apache Airflow / Amazon MWAA, AWS Glue
Cloud & Infrastructure: AWS (S3, Redshift, DynamoDB, Athena, EMR, Lambda)
Data Engineering: Data Lakes, ETL pipelines, APIs, real-time ingestion
Nice to Have: Terraform, Talend, Informatica, CI/CD, TDD
Extra Points
- Personal projects demonstrating initiative and technical skills
- Strong communication skills (English C1 or higher required)
- Interest in multidisciplinary and cross-functional collaboration
What We Offer
- Competitive compensation in USD (contractor)
- Fully remote work environment
- Flexible and supportive culture
- Opportunity to work with international clients
- Growth opportunities in a fast-scaling tech company
If you're looking for a challenging environment and impactful work, we'd love to hear from you.