Added: 2025-05-27 13:26.00
Updated: 2025-05-30 03:36.36

Data Engineer @ VirtusLab

Wrocław, Lower Silesian Voivodeship, Poland

Type: n/a

Category: IT & Internet & Media

Requirements: English
Company: VirtusLab
Region: Wrocław, Lower Silesian Voivodeship

We are #VLteam: tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self-development opportunities, and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of measuring outcomes, not hours. Join us and see for yourself!

Project scope

We are building a modern Data and Integration Platform for a fast-scaling insurance client. Our work consolidates fragmented legacy systems, organizes data from 200+ sources, and creates a standardized, future-proof cloud-native environment. We aim to unlock the full value of the company's data, enable more informed and faster decision-making, and provide the backbone for business growth, integration, and AI readiness. This includes setting up a transparent, role-based, governed data environment and engineering a robust, scalable integration hub to connect internal systems and third-party services.

Tech stack

SQL, Data modelling, Data Quality, Python, Azure, Apache Iceberg, Trino, Airflow, dbt, DevOps, IaC
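
To give a flavour of how these tools fit together, here is a minimal sketch of querying the lakehouse from Python through Trino, assuming the platform exposes its Apache Iceberg tables via a Trino/Starburst endpoint; the host, schema, and table names below are hypothetical placeholders, not the client's actual setup.

# Minimal sketch: reading an Iceberg table through Trino from Python.
# Requires the `trino` client package; endpoint, schema, and table names
# are hypothetical.
import trino

conn = trino.dbapi.connect(
    host="trino.example-platform.internal",  # hypothetical Starburst/Trino host
    port=443,
    http_scheme="https",
    user="data-engineer",
    catalog="iceberg",   # Iceberg catalog exposed through Trino
    schema="claims",     # hypothetical schema
)

cur = conn.cursor()
cur.execute(
    """
    SELECT policy_id, count(*) AS events
    FROM claim_events
    WHERE event_date >= DATE '2025-01-01'
    GROUP BY policy_id
    ORDER BY events DESC
    LIMIT 10
    """
)
for row in cur.fetchall():
    print(row)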

Challenges

We focus on delivering a trusted, high-quality, and well-governed data platform to replace a highly fragmented and immature technology landscape. The key challenges include consolidating over 200 legacy systems into a streamlined, standardized technology stack and designing and implementing a modern cloud-native data platform leveraging Azure, Starburst with Iceberg, Airflow, and the Power Platform suite. We are also building an integration layer and API hub to support third-party data ingestion, such as sanctions checks, foreign exchange, and entity validation. Another primary task is phasing out outdated tooling like KNIME and replacing it with maintainable, scalable workflows. Embedding strong DevOps practices, including Infrastructure as Code (IaC), automated testing, and CI/CD pipelines, is critical to delivering the platform. Additionally, we aim to enable tactical business outcomes, such as early data marts and reporting capabilities, while building towards the complete platform. Enhancing the developer experience, ensuring operational excellence, and embedding strong data governance with role-based access control are fundamental. All initiatives are entirely cloud-native and designed for automation, traceability, scalability, and business agility.
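
As an illustration of the orchestration pattern mentioned above, the following is a minimal, hypothetical sketch of an Airflow DAG that runs dbt transformations (with dbt tests as a data-quality gate) against such a platform; the DAG id, schedule, and project path are assumptions for illustration only.

# Minimal sketch: an Airflow DAG orchestrating dbt runs and tests.
# Import paths follow Airflow 2.x; the DAG id, schedule, and paths are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="0 5 * * *",                # daily at 05:00 (Airflow 2.4+ argument)
    catchup=False,
) as dag:
    # Build dbt models against the Trino/Iceberg target defined in profiles.yml.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/insurance_platform && dbt run --target prod",
    )

    # Run dbt tests so data-quality checks gate downstream consumers.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/insurance_platform && dbt test --target prod",
    )

    dbt_run >> dbt_test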

Team

We aim to build a small, agile, cross-functional team capable of delivering the complete data and integration project, from initial architecture to production operations. The team will be flexible and multidisciplinary to foster strong ownership, collaboration, and rapid delivery. It will work closely with the client's CTO and business stakeholders to ensure technical excellence, effective knowledge transfer, and alignment with strategic goals.



Don't worry if you don't meet all the requirements. What matters most is your passion and willingness to develop. Moreover, B2B does not have to be the only form of cooperation. Apply and find out!
