Lead Data Engineer – Fabric & Power BI | Federal Contract

Lead Data Engineer

Australian citizenship is required; PR and work-visa holders are not eligible. Candidates must hold a current Baseline or higher security clearance.

What to Submit

  1. A tailored resume in .docx format
  2. A one-page (5,000-character) summary response to the selection criteria below.

RFQ Details

  • RFQ ID: LH-06425
  • Agency: National Archives of Australia
  • Closing Date: Monday, 11 May 2026 – 11:59pm (Canberra time)
  • Estimated Start Date: Wednesday, 01 July 2026
  • Initial Contract Duration: 6 months
  • Extension Term: 6 months
  • Number of Extensions: 1
  • Experience Level: Lead – EL1 equivalent
  • Security Clearance: Current Baseline or higher
  • Location of Work: ACT
  • Working Arrangements: Hybrid, combining in-office and remote work. Some mandatory travel may be required for candidates based outside the ACT.
  • Maximum Hours: 40 hours per week

Job Details

The Unified Data View project aims to aggregate, unify and present collection metadata from multiple sources into a single cohesive view to support decision making and data accessibility. The National Archives of Australia is seeking a Data Engineer to support delivery of this initiative, including implementation of a data warehousing solution aligned to archival business rules and practices.

Key Duties and Responsibilities

  • Design and maintain scalable data solutions and infrastructure
  • Configure and optimise data warehouse architecture and models
  • Develop and maintain source-to-target data mappings
  • Build and operate ETL/ELT pipelines for ingestion and transformation of data
  • Identify and prioritise datasets for reporting and analytics
  • Implement data validation, cleansing and governance processes
  • Establish and monitor data quality standards
  • Develop Power BI reports and visualisations
  • Collaborate with stakeholders to align business and technical requirements
  • Apply APS values and organisational standards

Technical Skills

  • Strong data engineering skills including data modelling and ETL/ELT pipelines
  • Proficiency in Microsoft Fabric (mandatory)
  • Experience with Power BI including DAX and data modelling (mandatory)
  • Experience with Python for data processing and automation (mandatory)
  • Familiarity with PySpark, Pandas and Fabric management
  • Experience with Azure DevOps and Git repositories
  • Knowledge of Azure architecture, optimisation and cost management
  • Experience with data governance, validation and compliance frameworks

Selection Criteria

The buyer has specified that each candidate must provide a one-page pitch (equivalent to 5,000 characters) addressing all of the criteria below.

Essential criteria

  1. Technical Expertise Proven capability in designing and optimising scalable data solutions in Microsoft Fabric and Azure, including the architecture and configuration of enterprise data warehouses. Strong expertise in data mapping and modelling, producing accurate ERDs, domain classifications, and table groupings that reflect business processes, while analysing view logic, data lineage, and cross-system dependencies to ensure transparency and integrity. Skilled in rationalising complex data environments by analysing and consolidating system knowledge into clear, consumable artefacts for technical and leadership stakeholders. Knowledge and understanding of data modelling, Power Query, and Data Analysis Expressions (DAX). Familiarity with PySpark, Pandas, Microsoft Fabric management CLI tools, and schema design for interoperability is desirable, as are relevant certifications in Microsoft Fabric, Azure, Data Engineering, or Business Intelligence.
  2. Cloud & DevOps Demonstrated experience using Azure DevOps for version control, task management, branching and merging, utilising Git repositories and Fabric deployment pipelines. Understanding of cloud architecture principles, performance optimisation, and cost management in Microsoft Azure environments.
  3. Project Delivery Experience Backlog prioritisation, such as balancing business value with technical dependencies. Testing support, such as writing test scenarios, validating data outputs, supporting UAT, and performing SQL-based checks. Documentation discipline, such as maintaining mapping documents, lineage diagrams, and requirements traceability.
  4. Data Governance & Security Knowledge of data quality frameworks, validation, cleansing, and governance processes. Awareness of security, privacy, and compliance requirements relevant to government data (e.g., Archives Act, Privacy Act).
  5. Analytical & Visualisation Skills Experience creating data maps by extracting, analysing, and documenting view logic to form a cohesive view of source systems and source-to-target mapping. Use of critical thinking, curiosity, and analytical ability to design and develop interactive dashboards and reports in Power BI. Strong data analysis and storytelling skills to support decision-making.
  6. Quality Assurance and Deliverable Management Ensures that project and product quality reviews are conducted on schedule and according to procedure. Manages deliverables to be completed within agreed cost, time, and resource constraints, and ensures formal acceptance by stakeholders.

Desirable criteria

  1. Information Management and Decision Support Captures and disseminates technical and business information effectively. Facilitates business decision-making processes and provides informed feedback to promote understanding across stakeholder groups.
  2. Security Clearance Preference is for the candidate to hold a Baseline security clearance due to the tight timeframe for project delivery; however, candidates with the ability to obtain and maintain a Baseline security clearance will be considered.
  3. Desirable Skills and Qualifications Relevant certifications in Microsoft Fabric (DP-600, DP-700), Power BI (PL-300), Azure (AZ-900), Data Engineering, or Business Intelligence. Knowledge of metadata management and interoperability standards. An understanding of collection data within the Galleries, Libraries, Archives and Museums sector would be advantageous. Familiarity with RecordSearch, Preservica, and Mediaflex is advantageous but not essential.