Senior Specialist - Data Engineering (Hyderabad, Pune)

About the job

Location: Hyderabad / Pune / Noida / Any

No. of positions open: 1

Years of Experience: 5-9 Years

Prerequisite skills:

GCP (BigQuery, Dataproc), Airflow, PySpark, Python, SQL.

Should be willing to work in the 2nd shift.

Job Description:

We are looking for highly skilled Data Engineers with 5 to 9 years of experience in data engineering, specializing in PySpark, Python, GCP (IAM, Cloud Storage, Dataproc, BigQuery), SQL, and Airflow, and in building data pipelines that handle terabyte-scale data processing. The ideal candidate will have a strong background in designing, developing, and maintaining scalable data pipelines and architectures.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using PySpark, Python, GCP, and Airflow (illustrative sketches follow this list)
  • Implement data processing workflows and ETL processes to extract, transform, and load data from various sources into data lakes and data warehouses
  • Manage and optimize data storage solutions using GCP services
  • Terabyte-scale data processing: develop and optimize PySpark code to handle terabytes of data efficiently; apply performance-tuning techniques to reduce processing time and improve resource utilization
  • Data lake implementation: build a scalable data lake on GCP Cloud Storage to store and manage structured and unstructured data
  • Data quality framework: develop a data quality framework using PySpark and GCP to perform automated data validation and anomaly detection, improving data accuracy and reliability for downstream analytics
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions
  • Perform data quality checks and validation to ensure data accuracy and consistency
  • Monitor and troubleshoot data pipelines to ensure smooth and efficient data processing
  • Stay updated with the latest industry trends and technologies in data engineering
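
To give candidates a feel for the pipeline work described above, here is a minimal PySpark sketch that reads raw files from a Cloud Storage data lake, applies a simple transformation, and writes the result to BigQuery. The bucket, table, and column names are hypothetical, and the Spark-BigQuery connector (typically preinstalled on Dataproc) is assumed to be available; this is an illustration, not the team's actual pipeline code.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical bucket, dataset, and column names, used purely for illustration.
RAW_PATH = "gs://example-data-lake/raw/orders/*.parquet"
BQ_TABLE = "example_project.analytics.orders_daily"
TEMP_BUCKET = "example-dataproc-temp"

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Extract: read raw Parquet files from the Cloud Storage data lake.
orders = spark.read.parquet(RAW_PATH)

# Transform: drop invalid rows and aggregate to a daily grain.
daily = (
    orders
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

# Load: write the aggregate to BigQuery via the Spark-BigQuery connector.
(daily.write
      .format("bigquery")
      .option("table", BQ_TABLE)
      .option("temporaryGcsBucket", TEMP_BUCKET)
      .mode("overwrite")
      .save())

spark.stop()
```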
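
The data quality responsibility can likewise be pictured as automated rule checks over a DataFrame. The rules, thresholds, and column names below are assumptions for the sketch, not a description of any specific in-house framework.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical curated data-lake path used for illustration.
df = spark.read.parquet("gs://example-data-lake/curated/orders/")

# Each rule is (name, failing-row condition); names and conditions are illustrative.
rules = [
    ("null_order_id", F.col("order_id").isNull()),
    ("negative_amount", F.col("amount") < 0),
    ("future_order_date", F.col("order_date") > F.current_date()),
]

total = df.count()
for name, failing_condition in rules:
    failed = df.filter(failing_condition).count()
    # In a real pipeline these results would be written to a metrics table
    # and used to alert on anomalies; here we simply print them.
    print(f"{name}: {failed}/{total} rows failed")

spark.stop()
```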
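
Finally, orchestrating such a job with Airflow on Dataproc might look like the sketch below, assuming a recent Airflow 2.x install with the Google provider package; the project, region, cluster, and script URI are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Hypothetical project, region, cluster, and script location for illustration only.
PROJECT_ID = "example-project"
REGION = "us-central1"
CLUSTER = "example-dataproc-cluster"

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/orders_daily_etl.py"},
}

with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # run daily at 02:00
    catchup=False,
) as dag:
    # Submit the PySpark ETL job to an existing Dataproc cluster.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_orders_daily_etl",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
```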