Job Opening: Data Architect with GCP

About the Job

Introduction

You will be part of the Digital Analytics Team, reporting to the Digital Analytics Manager, and your primary focus will be supporting the business. As a data architect, you will be part of a team of visionaries who translate business requirements into technology requirements and define data standards, practices, and principles.

The data architect is responsible for partnering with other data architects, working collaboratively with the business, and designing the data management framework for various business projects.

The framework describes the processes used to plan, specify, enable, create, acquire, maintain, use, archive, retrieve, control, and purge data. The data architect also provides the business with a standard common business vocabulary, expresses strategic requirements, outlines high-level integrated designs to meet those requirements, and aligns them with the broader enterprise strategy and related business architecture.

Role and Responsibilities

Data Architects are a key part of digital transformation.

The focus of this role is the planning and design of data frameworks, master data management (MDM), and pipelines, with a main emphasis on efficiently designing and implementing them on our cloud-based Enterprise Data Lake.

You will work hand in hand with Enterprise/Solution Architects and Data Engineers to create and implement plans that improve existing data models and structure tables with design patterns that improve system performance.

Generally, a data architect has in-depth knowledge of and expertise in database technologies, along with solid programming, design, and system analysis skills, though these are not strictly required to be successful in the role. A willingness to learn new technologies that can help improve the current system is essential.

The Work You May Do

· Interact with key team leaders and managers to understand their data requirements and play a key role in driving the data design forward with the data engineers.

· Translate business requirements and business operation practices into the existing technology data framework

· Manage the metadata for various enterprise data pipelines

· Define and design pipelines and processes for cleaning and organizing data, as well as build tools to help make the data accessible to data scientists and the business.

· Leverage Google Cloud Platform and similar tools.

· Understand and support a corporate data model and overall data governance

· Communicate with application, back-office and external customer teams regarding data requirements, standards, performance and access

· Define and perform unit testing of database code as appropriate.

· Perform code reviews and audits of application teams' database code to ensure compliance with established best practices.

· Utilize big data analytics tools and apply techniques for retrieving, preparing, and loading data from a wide variety of sources using various data engineering tools and methods.

· Build integrations (API, etc.) with on-premises and cloud databases

· Apply a strong understanding of programming languages extensively used in data science applications (e.g., R, Python, Scala, Java)

· Debug, troubleshoot, design and implement solutions to complex technical issues.

· Build capabilities with software tools (e.g. Google Cloud Platform)

· Work in an Agile, collaborative environment, partnering with data scientists, architects, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.

· Design and implement data solutions for operational and secure integration across systems.

· Demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.

· Improve operations by conducting systems analysis to understand process and application bottlenecks and by recommending changes in policies and procedures.

· Leverage Agile, CI/CD, and DevOps methodologies to deliver high-quality products on time

Who You Are

· You have a knack for transforming and migrating data on cloud platforms, leveraging cloud-native tools.

· You ask questions and love to learn.

· You desire detailed knowledge of the way the business operates today and of its plans for changes in practices and goals, whether leading or supporting those changes.

· You have experience building high-capacity data analytics systems using cloud platforms and services for storing, collecting, and analyzing data (Google Cloud preferred).

· You love to tinker and understand the why.

· You have the ability to deal with ambiguity and the flexibility to change direction as additional information becomes available.

· You understand the value and purpose of good project management.

Required Professional and Technical Expertise

· Bachelor's degree in computer science, software engineering, or a closely related field; a master's degree is preferred

· 2+ years of experience developing batch and streaming ETL processes (preferably Talend)

· 2+ years of experience with relational and NoSQL databases, including modeling and writing complex queries

· 2 years of development experience leveraging an Agile methodology, including collaboration on coding projects in Google Cloud Platform

· 2+ years of industry experience with modern programming languages like Python (preferred), Java, etc.

· 1+ years of industry experience with graph databases like Neo4j or TigerGraph (preferred)

· Exposure to columnar cloud databases, preferably BigQuery.

· Exposure to relational database platforms like Microsoft SQL Server, Oracle, etc.

· Progressive mindset particularly around deployment models and emerging technologies

· 2+ years of experience in cloud engineering or a similar role, with demonstrated relationship-building skills resulting in traceable, measurable, impactful results

· Experience working in team environments and implementing organizational change.