Job Openings: Data Engineer (929)

About the job: Data Engineer (929)

Company Description

We are a consulting company full of happy, technology-minded people!

We love technology, we love design and we love quality. Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued.

With us, each individual can be themselves and respects others for who they are. We believe that when a fantastic mix of people gather and share their knowledge, experiences, and ideas, we can help our customers on a completely different level.

We are looking for someone who can join immediately and wants to grow with us!

With us, you have great opportunities to take real steps in your career and the opportunity to take great responsibility.

Job Description:

We are looking for an experienced Data Engineer to join our team in Bangalore. The ideal candidate should have strong experience in Python, Java, and/or Scala, along with expertise in the GCP cloud platform.

  • Take end-to-end responsibility to build, optimize, and support existing and new data products towards the defined target vision.
  • Ensure that our data products work as independent units of deployment and that their non-functional aspects follow the defined standards for security, scalability, observability, and performance.
  • Work closely with the Product Owner and other stakeholders on the vision for existing data products and on identifying new data products to support our customers' needs.
  • Work with product teams within and outside our domain on topics related to the data mesh concept.
  • Evaluate and drive continuous improvement and reduce technical debt within the teams.
  • Maintain expertise in the latest data/analytics and cloud technologies.
  • Develop and maintain SSIS cubes for multidimensional data modeling and efficient OLAP querying to support business intelligence needs.
  • Create interactive dashboards and reports in Power BI (or an equivalent dashboard/reporting tool), integrating data from cloud and on-prem sources to deliver actionable insights.


Key Responsibilities:

  • Design, develop, and maintain robust data pipelines using Azure Data Factory and related services.
  • Build and optimize scalable data models in BigQuery, ensuring high-performance querying and storage efficiency.
  • Ensure data governance, security, and compliance best practices across cloud environments.
  • Implement monitoring, logging, and performance tuning across pipelines and data stores.
  • Automate workflows and data orchestration using tools such as Azure Logic Apps, PowerShell, etc.
  • Assist in migrating workloads between GCP and Azure as needed.
  • Implement data integration solutions between Azure and GCP BigQuery.

Required Skills:

  • 6–9 years of experience in data engineering with a focus on the Azure Data Platform.
  • Strong experience in GCP BigQuery, including data modeling, query optimization, and storage partitioning.
  • Proficiency with Azure Data Factory or similar tools.
  • Working knowledge of SQL, Python, and scripting for data workflows.
  • Experience integrating and transforming large datasets across cloud services.
  • Knowledge of data lake architecture and structured and unstructured data processing.
  • Understanding of cloud networking, security, encryption, and IAM policies.
  • Familiarity with CI/CD pipelines for data solutions using Git.
  • Excellent problem-solving skills and attention to detail.

Start: Immediate

Location: Bangalore (Hybrid)

Form of employment: Full-time until further notice; we apply a 6-month probationary period.

We interview candidates on an ongoing basis, so do not wait to submit your application.