About the job: Senior Data Solutions Architect (AWS/Python/Snowflake) - Remote, Portugal
ABOUT THE OPPORTUNITY
Join a world-class technology consultancy where software is built by people, for people, creating high-performance systems that impact users worldwide. We're seeking a Data Solutions Architect to join an agile, collaborative team where your voice matters as much as your code. You'll be a technical partner building resilient data systems that scale, working closely with leadership (CEO/COO) and clients, so clear communication and collaboration are key. We value empathy, self-organization, and the courage to take risks.
PROJECT & CONTEXT
You'll own the lifecycle of data pipelines and storage systems across cloud environments, ensuring they are scalable, secure, and high-performing. The role involves working closely with developers, partners, and leadership to deliver robust data solutions and integrate best practices into daily workflows. You'll build software services and APIs using Python frameworks to create smooth, reliable data processing and deployment experiences. You'll apply the appropriate data modeling methodology (Kimball, Inmon, Data Vault) to each problem, optimizing the stack for every challenge. Working with streaming data platforms like Kafka or Kinesis and with proprietary and open-source platforms including Snowflake, Databricks, Vertica, Spark, and Airflow, you'll enable teams through automation and continuous improvement. Expected travel varies by project (0-15%).
WHAT WE'RE LOOKING FOR (Required)
- 6+ years of experience in data architecture and engineering roles
- Cloud platform expertise: Solid experience working in cloud environments with strong exposure to AWS, plus familiarity with Azure and GCP
- Streaming data experience: Hands-on with streaming platforms like Apache Kafka or AWS Kinesis
- Data platform proficiency: Experience with proprietary and open-source platforms including Snowflake, Databricks, Vertica, Apache Spark, and Apache Airflow
- Deep Python expertise: Strong programming skills including organizing Python-based projects and building software services and APIs
- Data modeling mastery: Strong understanding of different data modeling methodologies including Kimball dimensional modeling, Inmon enterprise data warehouse, and Data Vault 2.0
- Pipeline ownership: Ability to own complete lifecycle of data pipelines and storage systems
- Automation focus: Experience building automated data processing and deployment solutions
- Fast-paced environment: Comfortable tackling complex problems across different technologies
- Collaboration skills: An excellent collaborator who works seamlessly with developers, partners, and leadership
- Communication excellence: Ability to explain data modeling techniques, architectures, and technical concepts to various stakeholders
- Growth mindset: Self-organized, with the courage to take risks and the belief that teaching others is the best way to learn
- Language requirement: Fluent English (mandatory for client and leadership communication)
NICE TO HAVE (Preferred)
- Experience with additional data platforms and technologies
- Background in real-time analytics and event-driven architectures
- Familiarity with data governance and security frameworks
- Knowledge of Infrastructure as Code (Terraform, CloudFormation)
- Experience mentoring data engineers and technical teams
- Understanding of DataOps practices and methodologies