Senior Data Engineer – AI & Cyber Intelligence (Fractional | Remote)

PLEASE READ THE FULL JD BEFORE APPLYING. INCOMPLETE APPLICATIONS MAY NOT BE CONSIDERED.

Remote | Fractional Engagement (Approx. 8 hours per week)

About the Opportunity

Our client is an emerging technology company developing advanced AI-driven cyber-intelligence and risk-analysis platforms for highly regulated industries.

As the platform evolves, they are seeking a Senior Data Engineer (Fractional) to support the development of a scalable, secure data infrastructure that underpins real-time analytics, AI models, and risk intelligence capabilities.

This opportunity is suited to experienced data engineers who are comfortable contributing within early-stage environments, where building foundational systems and working with high levels of autonomy are essential.

About the Role

This role is structured as a fractional technical engagement (approximately 8 hours per week).

You will play a key role in designing and building the data backbone supporting AI-driven cybersecurity and compliance systems, working alongside engineering and AI teams to enable scalable, production-ready data pipelines.

The focus is on creating robust, secure, and AI-ready data infrastructure capable of supporting advanced analytics and machine learning applications.

Key Responsibilities

Data Engineering & Architecture

  • Design and build scalable data pipelines across cloud environments (AWS/Azure)
  • Develop and optimise data lakes, warehouses, and data models
  • Implement ETL/ELT workflows, ensuring reliability and scalability
  • Maintain data quality, lineage, and observability across pipelines

AI & Analytics Enablement

  • Collaborate with AI/ML teams to provide high-quality datasets
  • Support LLM and advanced analytics use cases through efficient data processing
  • Enable analytics and reporting through well-structured datasets

Security, Compliance & Governance

  • Implement secure, privacy-first data architecture
  • Align data systems with frameworks such as NIST, ISO 27001, SOC 2, and DORA
  • Establish governance, access control, and auditability standards

Operational Delivery

  • Monitor and optimise pipeline performance and cost
  • Contribute to DevOps/MLOps practices for data workflows
  • Evaluate and integrate tools to improve the data platform

Who This Opportunity Is For

You are likely to be a strong fit if you:

  • Have 5+ years of experience in data engineering or related roles
  • Have built production-grade data pipelines in cloud environments
  • Are highly proficient in Python and SQL
  • Have experience with tools such as Airflow, dbt, Kafka, Spark, or Databricks
  • Have worked in cybersecurity, fintech, or regulated environments (preferred)
  • Understand how data supports AI/ML workflows and advanced analytics
  • Are comfortable working independently in a lean, fast-moving environment

Engagement Structure

This opportunity is structured as a fractional engagement (approximately 8 hours per week), offering flexibility and the chance to contribute to the development of a growing AI-driven cybersecurity platform.

The role is designed for professionals interested in contributing to an evolving technology environment, with scope to expand involvement as the platform progresses through future growth stages. It is best suited to those comfortable participating in early-stage ventures, where engagement structures evolve alongside company growth.

Further details will be discussed with shortlisted candidates.

On Offer

  • Opportunity to build the data foundation for an advanced AI-driven platform
  • Flexible, remote working structure
  • Strategic technical influence within a growing technology environment
  • Collaboration with experienced professionals across AI, cybersecurity, and financial services

Application Instructions

Please submit:

  • Your CV
  • A brief summary of your experience building data pipelines in cloud environments