About the job: (Senior) Manager Data Architect, Artificial Intelligence & Data (19824)
About Virtido
Virtido is an entrepreneurial and innovative IT and business process outsourcing company headquartered in Zurich, Switzerland. We realize ideas and projects from strategic conception to technical implementation in close cooperation with our dynamic clients.
About the client and the job
We work together with our partner in Switzerland and support them in finding the best candidates for their clients. The client is one of the leading audit and advisory firms in Switzerland and worldwide.
We are looking for a (Senior) Manager Data Architect Artificial Intelligence & Data to work on-site at our client's office in Zurich.
Responsibilities:
- As a Data Architect in our AI & Data team, you will help senior people at our clients make the right architectural choices and lead the design and implementation of innovative enterprise-wide solutions. You will be able to make a significant impact in the following areas:
- Lead Enterprise Data Architecture design: help clients modernize complex system landscapes spanning multiple solutions (e.g., MDM, CRM, ERP, Cloud DWH) that need to be integrated. You will work with senior client stakeholders to design enterprise-level architecture and pragmatic, enforceable architecture standards, and plan the modernization journey with them.
- Lead Cloud Data Platform design & implementation: help clients build scalable, high-performance cloud data platforms that meet their business needs. Your work will cover data modelling, data integration, transformation pipelines, DevOps, and the use of AI to build and operate these platforms smartly. You will make key design decisions and lead onshore and near-/offshore engineering teams developing the solution.
- Architect Integrated Solutions: help clients design and build the integrations required to run their business processes and deliver data products. This may include integration between a Cloud data platform and an AI/GenAI platform, or between MDM or CRM systems and a Cloud data platform.
- Strengthen our market positioning and opportunities funnel: lead our architecture footprint in the market through events, conferences, and discussions with our key clients, internal teams, and alliance partners.
- Develop our internal know-how and expertise: mentor junior team members within AI & Data and set the strategic focus for expertise building within Switzerland and in relation to near- and offshore teams.
Requirements:
Experience
- You have 4 or more years of relevant professional experience as a Data Architect and have led teams that designed and implemented complex cloud-based ecosystems.
- You have an engineering background and at least one hands-on end-to-end data platform implementation as a data engineer (AWS, Azure, Google Cloud, Databricks, Snowflake).
- You have experience evaluating and/or making data architecture decisions, influencing strategic direction, and convincing business and IT stakeholders.
- You have experience guiding data engineers on best practices in coding style and quality, and are familiar with industrial code-quality tools such as SonarQube as well as AI coding assistants integrated into the IDE.
- You have experience across a range of database types, including structured (relational), semi-structured (NoSQL), and graph-based (knowledge graph) models.
- Nice to have: experience packaging data platform solutions as Infrastructure as Code using tools such as Terraform.
Drive and Motivation
- Focused individuals looking to grow their careers on the path to becoming a Senior Data Architect within AI & Data consulting.
- Keen interest and ability to stay on top of technology and architectural trends.
- Ability to manage multiple activities in a fast-paced environment and context-switch without losing focus or the big picture.
Leadership, commercial acumen, and communication skills:
- Strong client-facing presence and ability to connect with different senior stakeholders, build relationships, and assess, expand, and convert opportunities into projects.
- Strong abstract reasoning skills to grasp enterprise-level, solution-level, and operational concepts and how they link together to form a solution.
- Strong communication skills, both oral and written, with the ability to clearly articulate a point of view that resonates with technical and business stakeholders.
- Strong team leadership skills with the ability to partner with externals, estimate and plan the work, and coach team members to deliver optimal client outcomes.
Technical Expertise:
- Strong skills in modelling methods with experience or certifications from DAMA, TOGAF or equivalent: Conceptual, logical, and physical data models; Dimensional modelling; Data Vault.
- Design of normalized and denormalized schemas for analytical and operational workloads, and familiarity with common modelling tools.
- Hands-on experience in designing and implementing solutions on major cloud platforms: Amazon Web Services (AWS), Google Cloud Platform (GCP), Azure.
- Expertise in architecting modern data platforms (Lakehouse, Medallion, Lambda, event-driven) in Snowflake and/or Databricks, or equivalent.
- Design E2E architectures that balance performance, cost, and maintainability.
- Estimate implementation effort and lead delivery teams.
- Create architecture blueprints and solution design documents.
- Use of GenAI capabilities to optimize delivery and operation.
- Data Integration & ETL.
- Design integration layer architecture from multiple sources (ERP, CRM, operational databases, APIs, files).
- Nice to have: familiarity with ETL/ELT tools such as Informatica or dbt.
Metadata and event-driven architecture:
- Expertise in metadata and event-driven architecture with the ability to articulate best practices for capturing metadata within a data pipeline.
- Ability to leverage metadata to develop event-driven logic within the pipeline for monitoring and alerts.
- Enterprise architectures with a strong focus on integrating MDM solutions with data platforms and extending the data platform with AI Platform components.
Programming frameworks and data formats:
- Ability to review engineering work done in SQL, Python, Spark.
- Familiarity with relational and NoSQL databases, knowledge graphs, and big data file formats (Parquet, Iceberg).
Language Proficiency:
- Fluent in English; fluency in German and/or French is highly preferred.
What we offer
- Development and training opportunities.
- A comfortable and friendly work environment and a proactive, constantly developing team.
- Additional benefits may apply.
Does this resonate with you? Then we look forward to receiving your application!