Job Openings
Data Engineer (Madrid, hybrid)
Job Title: Data Engineer
Location: Madrid (hybrid, 2 days/week in the office)
The company is a consulting firm headquartered in Belgium, with additional offices in Spain, including Madrid. It focuses on digital transformation, business innovation, and customer experience, working with organizations to optimize operations, leverage data and AI, and design improved customer journeys. Its approach is human-centric, combining creativity with data-driven insight to deliver practical, impactful change.
Responsibilities
- SQL Database Management: Design, implement, and maintain SQL databases to ensure efficient data storage and retrieval.
- Optimize and tune SQL queries for maximum performance.
- Elasticsearch Integration: Work with Elasticsearch to index, search, and analyze large volumes of data efficiently.
- Collaborate with cross-functional teams to integrate Elasticsearch into our data ecosystem.
- Azure Data Factory Pipelines: Develop and manage data pipelines using Azure Data Factory.
- ETL Development and Maintenance: Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement data quality checks and ensure the integrity of data throughout the ETL process.
- Ensure the reliability, scalability, and efficiency of data movement within the Azure cloud environment.
- Work on migrating our data sources into Snowflake.
- Proactive Problem Solving: Proactively identify and address data-related issues, ensuring data accuracy and consistency.
- Collaborate with other teams to understand their data requirements and provide effective solutions.
- Clearly communicate complex technical concepts to non-technical stakeholders.
- Collaborate with data scientists, analysts, and other team members to understand data needs and deliver solutions.
- Documentation: Maintain thorough documentation for all data engineering processes, ensuring knowledge transfer and best practices.
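For candidates wondering what "data quality checks throughout the ETL process" can mean in practice, here is a minimal, purely illustrative Python sketch; the record fields, helper names, and in-memory "warehouse" are all hypothetical and not part of this role's actual stack:

```python
# Illustrative ETL pipeline with a data-quality gate between
# transform and load. All field names here are hypothetical.

def extract(rows):
    """Stand-in for a source extract: returns raw records."""
    return list(rows)

def transform(records):
    """Normalize a hypothetical 'amount' field from string to float."""
    out = []
    for r in records:
        r = dict(r)                      # copy so the source stays untouched
        r["amount"] = float(r["amount"])
        out.append(r)
    return out

def quality_check(records):
    """Simple quality gate: reject null ids and negative amounts."""
    for r in records:
        assert r.get("id") is not None, "null id"
        assert r["amount"] >= 0, f"negative amount for id {r['id']}"
    return records

def load(records, warehouse):
    """Stand-in for a warehouse load: a dict keyed by record id."""
    for r in records:
        warehouse[r["id"]] = r
    return warehouse

raw = [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "0"}]
wh = load(quality_check(transform(extract(raw))), {})
```

In a real pipeline the same gate pattern would typically run as dbt tests or validation activities inside an Azure Data Factory pipeline rather than inline assertions.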
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in SQL database design and optimization.
- Experience with GitHub or GitLab.
- Experience with dbt.
- Strong ETL development skills.
- Experience with data modeling.
- Hands-on experience with Snowflake or Databricks.
- Proficiency in creating and managing data pipelines using Azure Data Factory.
- Excellent problem-solving and analytical skills.
- Proactive mindset with the ability to work independently and collaboratively.
- Strong communication and interpersonal skills.
Preferred Skills
- Familiarity with other cloud platforms (AWS, GCP).
- Experience with big data technologies.
- Knowledge of data warehousing concepts.
- Certifications in relevant technologies.