Data Engineer
About the Company and Role
We are looking for a highly motivated Data Engineer to join a small and growing team at a high-performing trading firm. This role is ideal for those who want to see tangible impact from their work. You will be collaborating directly with quants and portfolio managers to architect data flows and build resilient ETL systems that power real-time decision-making.
This is not a back-office or purely maintenance role. You will help shape how the firm ingests, processes, and leverages data across multiple strategies. The position offers significant ownership and influence for engineers who enjoy seeing the direct results of their technical contributions. This is a rare opportunity.
We are especially interested in candidates who prefer working with on-premises infrastructure, value clarity and performance in their code, and have strong instincts around data reliability and system orchestration.
Tech Stack: Python, SQL, Airflow, Unix/Linux, Postgres, MongoDB, on-premises infrastructure, Git, data warehousing, data modelling.
Location: London, UK (hybrid, with 3 days per week in the office).
We are offering:
- Competitive, market-leading compensation package (aligned with top-tier trading firms);
- Private health, vision, and dental insurance;
- Flexible paid sick leave;
- Impact-driven environment where contributions are visible and valued;
- The opportunity to work closely with PMs, quants, and senior engineering leadership;
- High level of ownership and the ability to directly influence data architecture and strategy.
Qualifications we are looking for:
- Strong knowledge of Python and SQL;
- Experience working in on-premises environments (cloud experience neither required nor preferred);
- Hands-on experience with Airflow (or similar orchestration tools);
- Proficiency with Unix/Linux systems;
- Familiarity with relational and NoSQL databases (Postgres, MongoDB, etc.);
- Understanding of data modelling principles and trade-offs (normalisation vs denormalisation);
- Comfortable with Git workflows and collaborative engineering environments;
- Bachelor's degree in Computer Science, Engineering, or related technical field.
Responsibilities:
- Partner with PMs and Quant Developers to translate trading and research needs into reliable data pipelines;
- Expand and maintain the internal data warehouse;
- Build and optimise ETL workflows using orchestration tools (Airflow preferred);
- Improve robustness, speed, and scalability of internal data tools;
- Implement monitoring, anomaly detection, and data quality checks;
- Manage data entitlements and vendor relationships;
- Evaluate and propose tooling for data ingestion, streaming, and management.
Nice to have:
- Experience working in fast-paced tech or financial environments;
- Background in data engineering for trading or analytics products;
- Java skills;
- Familiarity with Hadoop or other on-prem data solutions;
- Master's degree in a quantitative or technical field.
If this sounds like the kind of environment where you do your best work, we would love to meet you.
You can apply directly through the website or message our recruiter on Telegram: @ekaterinafilimonova_hr.