Added: 2025-05-15 16:31:00
Updated: 2025-05-21 03:09:07

Data Engineering Specialist

Budapest, Hungary

Type: n/a

Category: IT & Internet & Media

Requirements: English
Company: Sanofi EU
Region: Budapest

Location: Budapest

About the job

Sanofi's Digital Data organization's mission is to transform Sanofi into a data-first and AI-first organization by empowering everyone with good data. Through custom-developed products built on world-class data foundations and platforms, the team creates value and a unique competitive advantage that scales across our markets, R&D, and manufacturing sites. The team is located in major hubs in Paris, Lyon, Budapest, Barcelona, Cambridge, Bridgewater and, most recently, Toronto. Join a dynamic, fast-paced, and talented team, with world-class mentorship, using AI and ML to chase the miracle of science.

In this role you will lead Data Engineering technical development to develop best-in-class cloud-first solutions to support our data and AI teams.

Main responsibilities:

Establish technical designs to meet business and technical requirements as the lead data engineer

Develop and maintain all components of data engineering solutions and pipelines based on requirements and design specifications using appropriate tools and technologies

Optimize solutions (ETL / data pipelines) to balance performance, functionality, cost, and other operational requirements

Conduct peer reviews for quality, consistency, and rigor of production-level solutions

Create design and development documentation based on standards for knowledge transfer, training, and maintenance

Define and promote best practices and standards for code management, automated testing, and deployments

Work with business and product teams to understand and decompose high-level business requirements, and translate them into technical needs

Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases

Develop automated tests for CI/CD pipelines

Test and validate developed solutions to ensure they meet requirements and design specifications

Mentor and train junior team members

Assess and fix data pipeline issues to ensure performance and timeliness of execution

Lead data engineering efforts to modernize and replace legacy solutions and platforms

Partner with Data Engineering Manager, Product Owner and Data Analysts to create implementation plan, including assessments of timelines and complexity

Gather/organize large & complex data assets, and perform relevant analysis

Actively contribute to Data Engineering community and define leading practices and frameworks

Communicate results and findings in a clear, structured manner to stakeholders at various levels

Remain up to date on the company's standards, industry practices, and emerging technologies

About you

Key Functional Requirements & Qualifications:

Experience working with cross-functional teams to solve complex data architecture and engineering problems

Demonstrated ability to learn new data and software engineering technologies in a short amount of time

Experience with agile/scrum development processes and concepts

Able to work in a fast-paced, constantly evolving environment and manage multiple priorities

Demonstrated technical analysis and problem-solving skills related to data and technology solutions

Excellent written, verbal, and interpersonal skills with ability to communicate ideas, concepts and solutions to stakeholders at all levels

Pragmatic and capable of solving complex issues, with technical intuition and attention to detail

Service-oriented, flexible, and approachable team player

Fluent in English (Other languages a plus)

Key Technical Requirements & Qualifications:

Bachelor's degree or equivalent in Computer Science, Engineering, or a relevant field

5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or comparable role with relevant technologies and tools (Spark, Informatica/IICS or equivalent)

Experience in Big Data integration, including cloud-based platforms such as AWS, and Snowflake

Proficiency in data structures and algorithms

Demonstrated experience in scripting languages (Python, SQL, shell scripting)

Experience with job scheduling and orchestration (Control-M and Airflow are a plus)

Strong proficiency in relevant technologies such as databases (relational and NoSQL), distributed systems, and microservices

Experience working with large data sets, and performance and query optimization

Why choose us?

An international work environment, in which you can develop your talent and realize ideas and innovations within a competent team.

Exposure to training and certification pathways for technical and functional skills from AWS, Snowflake, Informatica, and others.

Your own career path within Sanofi.