Job Opening: Sr. Data Engineer/Scientist

About the job: Sr. Data Engineer/Scientist

To get timely updates on job openings, feel free to join our community at https://www.industryacademiacommunity.com (400,000+ members from 35+ countries).


Looking for Python Ninjas with 5-8 years of experience to showcase their expertise in a stimulating environment geared towards building cutting-edge products and services. They should have a knack for data processing and scripting, and should be excited about delivering a scalable, high-quality, component-based model development framework using big data. They should be passionate, curious, and innovative enough to think beyond the ordinary.

Candidates should be able to join immediately or have a 30-day notice period.

Job Responsibilities:  

  • Collaborating across an agile team to continuously design, iterate, and develop big data systems.  
  • Working with the engineering team to support custom solutions offered to product development.
  • Extracting, transforming, and loading data into internal databases and Hadoop 
  • Understanding the data sets and how to bring them together
  • Bridging the gap between development, engineering, and data ops
  • Optimizing our new and existing data pipelines for speed and reliability 
  • Creating, maintaining and documenting scripts to support ongoing custom solutions  
  • Deploying new products and product improvements.  
  • Understanding of best practices and common coding patterns around storing, partitioning, warehousing, and indexing of data
  • Documenting and managing multiple repositories of code 

Mandatory Requirements:  

  • Hands-on experience in data pipelining and ETL (any one of: Hadoop, BigQuery, Redshift, Athena) and in Airflow
  • Familiarity with pulling and pushing files from SFTP and AWS S3
  • Familiarity with AWS Athena and Redshift is mandatory 
  • Familiarity with SQL programming to query and transform data from relational databases
  • Experience reading data from Kafka topics (both live stream and offline)
  • Experience with any cloud solution, including GCP / AWS / OCI / Azure
  • Familiarity with Linux (and a Linux work environment).
  • Excellent written and verbal communication skills.  
  • Experience with PySpark and DataFrames
  • Experience with SQL and NoSQL databases (MySQL, Cassandra)  

Preferred Requirements:  

  • Knows their way around REST APIs (able to integrate; publishing not necessary).
  • Comes from a product-development background

Qualities: 

  • Excellent organizational skills, including attention to detail.
  • Strong multitasking skills and the ability to work in a fast-paced environment.

Eligibility Criteria:  

  • 5 years of experience with database systems.
  • 5+ years of experience developing scripts in Python.
  • Bachelor's degree in IT or a related field

What's in it for the Candidate

  • Compensation as per industry standards, or based on experience and last CTC
  • Paid Leave

Job Location: Hyderabad (work from office, 5 days/week)

Note: As part of our vision, our flagship event 'IAC VISION 2030' aims to provide employment and career opportunities for millions of job-ready interns, freshers, and professionals in our Industry Academia Community (IAC).

By submitting your application, you confirm that you are a member of IAC or consent to being added to the IAC platform as a member of the Industry Academia Community.