Job Openings Senior Data Engineer (GCP) | WFH - Up to 110K Salary

Job Expectations:

  • Position Type: Experienced - Mid/Senior
  • Employment Type: Full-Time; Permanent (Direct Hire)
  • Work Setup & Location: WFH - Metro Manila
  • Work Schedule: Mondays to Fridays; Night Shift (may move to a mid shift after 3-6 months)
  • Industry: Healthcare

About the Job:

We are looking for a skilled and motivated Senior Data Engineer to design, develop, and implement advanced data warehouse solutions. You will build and optimize data pipelines, models, and infrastructure to support analytics and reporting across Clinical, Patient Accounting, and Corporate functions. This cloud-focused role requires hands-on experience with modern technologies such as Google BigQuery or equivalent. The ideal candidate has strong SQL and data warehousing skills, ensures data quality and performance, drives process improvements, and collaborates with cross-functional teams to deliver reliable insights in a fast-paced healthcare environment.

Qualifications:

  • Bachelor's Degree in Computer Science, Information Technology, or a related field
  • Strong SQL and relational database design/development expertise
  • Proven hands-on experience in data warehousing development (Teradata BTEQ, Oracle PL/SQL, SQL Server T-SQL, Netezza, or equivalent) with the ability and willingness to adapt to Google BigQuery
  • Proficiency in cloud-based data warehouse technologies such as Google BigQuery or similar
  • Skilled in building and optimizing large-scale data sets, data pipelines, and architectures
  • Strong experience in data modeling and data warehouse design
  • Ability to perform root cause analysis and deliver process improvements across structured and unstructured data
  • Excellent analytical and communication skills, with strong interpersonal ability to engage stakeholders at all levels
  • Willing to work on-site for training for 1-2 weeks at BGC, Taguig, for IT asset collection
  • Willing to work in US hours (graveyard shift)

Preferred (Not Required):

  • Experience in healthcare data systems or patient financial service systems
  • Familiarity with big data tools such as Kafka and Spark
  • Knowledge of workflow management tools (e.g., Airflow) and programming/scripting languages, including Python, Scala, Java, or C++
  • Understanding of hospital information systems and healthcare system integration (Cerner, Meditech, CPSI, Allscripts, EPIC, McKesson, NextGen)

Only shortlisted candidates will be contacted; updating/tailoring your CV is highly recommended.