Proxy Product Owner – Data (Azure/Snowflake/Power BI) – Hybrid Lisbon (2–3 days/week office)

ABOUT THE OPPORTUNITY

Join the data team of a large-scale, internationally recognized industrial group during an active and high-visibility build phase of a strategic Data Lake and BI platform. This is a hands-on delivery role with real ownership — sitting at the intersection of business stakeholders and technical data teams, translating KPI logic and reporting needs into concrete, deliverable backlog items.

The contract is consulting-based and located in Lisbon, with a hybrid model (2–3 days per week in the office).

PROJECT & CONTEXT

You'll be embedded in a cross-functional Agile team delivering a Snowflake-based Data Lake with Azure Data Factory orchestration and Power BI reporting — a platform being built to serve multiple business units across a complex multinational organization.

The role is explicitly focused on the BUILD phase: enriching the Jira backlog with data integration specs, transformation rules, and acceptance criteria; conducting UAT in Snowflake and Power BI; contributing to data modeling and KPI calculation logic; and monitoring pipeline orchestration. You'll work directly alongside the Product Owner and report progress to cross-functional stakeholders who range from technical data engineers to business analysts and finance teams.

This is not a purely functional role — you need to understand the data well enough to challenge assumptions, validate outputs, and contribute meaningfully to Snowflake data modeling and Power BI report architecture. But deep engineering expertise is not required — what matters is bridging the gap between business intent and technical delivery.

WHAT WE'RE LOOKING FOR (Required)

  • Junior to mid-level — 3+ years of experience in data, BI, or analytics projects, ideally with some exposure to a Product Owner or proxy PO function
  • Hands-on knowledge of Snowflake — including data modeling and running validation queries
  • Working knowledge of Azure Data Factory — understanding orchestration flows, pipeline scheduling, and data source integration
  • Proficiency in Power BI — building and refining data models, designing dashboards, validating KPI outputs
  • SQL proficiency — comfortable writing and reviewing queries for data analysis and UAT validation
  • Proven experience working in Agile / Scrum environments with Jira — backlog management, ticket refinement, sprint ceremonies
  • Ability to write clear functional specifications and acceptance criteria that technical teams can act on
  • Strong stakeholder management skills — you can facilitate requirements sessions with business users and translate outputs into development-ready tasks
  • Good general IT knowledge — APIs, protocols, data flows — enough to have credible conversations with data engineers
  • English B2+ — working language for documentation and cross-team communication

NICE TO HAVE (Preferred)

  • Formal Scrum or Product Owner certification (PSPO, CSPO, or equivalent)
  • Prior experience in large multinational or matrix organizations where stakeholder complexity is high
  • Background in KPI definition and data governance processes
  • Exposure to data lake architecture patterns or medallion/layered data platform designs
  • Experience running UAT sessions with non-technical business stakeholders
  • Familiarity with data pipeline monitoring and alerting practices
  • Previous work in industrial, manufacturing, or procurement data domains