Requirements: English
Company: Haybury
Region: Madrid, Community of Madrid
Are you passionate about transforming raw data into actionable insights? Join a dynamic, global team and help shape the future of data-driven decision-making.
Key Responsibilities:
- Build and maintain data transformation logic using dbt and Snowflake
- Write clean, efficient SQL for data cleaning, normalization, and enrichment
- Ensure data quality through testing and observability practices
- Manage data pipelines with Apache Airflow
- Collaborate across data product teams, IT, and business stakeholders
- Contribute to CI/CD workflows and documentation
Job Requirements:
- 3+ years in data engineering or a related field
- Hands-on experience with dbt, Snowflake, and Airflow
- Strong SQL skills and understanding of data quality frameworks
- Familiarity with Bitbucket, Git, and CI/CD practices
- Excellent communication skills in English
Bonus Points For:
- Experience with pharmaceutical datasets
- Knowledge of Data Vault, AWS, Jira, and Confluence
Ready to make an impact? Apply now or email us a copy of your CV!