About the job: Solutions Architect
We are seeking a highly experienced Solutions Architect specializing in Data Engineering and Analytics to design, build, and deploy scalable cloud-based data platforms. The ideal candidate will have deep expertise in Azure and Databricks, strong architectural knowledge of Data Lakehouse solutions, and a proven ability to guide teams in implementing enterprise-grade data platforms.
Key Responsibilities
- Architect & Design: Lead the design and implementation of Data Lakehouse architectures, including batch and real-time streaming pipelines using Delta Lake.
- Cloud Data Platforms: Build and optimize scalable data engineering solutions leveraging Azure Data Factory, Databricks, ADLS, Azure Event Hubs, and PySpark/Scala/SQL.
- Data Governance & Cataloging: Drive the adoption and implementation of Unity Catalog, ensuring robust data lineage, governance, schema management, and access control mechanisms.
- Performance & Cost Optimization: Review solutions and guide teams on performance tuning, cloud cost optimization, and architectural best practices.
- Technical Leadership: Act as a trusted advisor to engineering teams, providing hands-on guidance in data engineering, analytics, and cloud-native design patterns.
- Innovation & Strategy: Evaluate new tools, frameworks, and approaches to continuously improve the enterprise data ecosystem.
Key Skills & Experience
- 12+ years in Data Engineering & Analytics with a strong focus on cloud-based platforms.
- Proven expertise in Azure Cloud Services, Databricks, Delta Lake, and Lakehouse architectures.
- Hands-on experience with PySpark, Scala, and SQL for building large-scale data pipelines.
- Strong knowledge of data governance, security frameworks, and data cataloging tools (Unity Catalog preferred).
- Experience with streaming technologies (Azure Event Hubs, Kafka, or similar).
- Strong understanding of data pipeline frameworks, distributed computing, and performance optimization.
- Excellent communication, leadership, and stakeholder management skills.