Requirements: English
Company: Limango Polska
Region: Kraków, Lesser Poland Voivodeship
At limango, we have been specializing in e-commerce for 14 years. Together with our platforms in the Netherlands and Germany, we are part of the OTTO Group, one of Europe's leading e-commerce companies. We are the shopping platform with the largest selection of products for the whole family! We work and play together. Join limango IT!
In limango IT, you get the chance to contribute your own ideas and know-how to maintain and develop our highly frequented, self-built online shop for our markets in Poland, Germany, Austria and the Netherlands.
As a Data Engineer, you will play a key role in one of the company's most strategic initiatives, led by the Data Platform Team (DPT): migrating legacy data solutions to a modern, cloud-based environment and building a new enterprise data warehouse based on the Data Vault methodology.
Contribute to the development of a centralized Lakehouse Data Platform built on AWS and Databricks.
Manage and evolve data infrastructure (Terraform on AWS / Databricks) and ETL pipelines (PySpark, Delta Live Tables, SparkSQL).
Implement monitoring, data quality testing, unit tests, and automated alerting within the platform.
Refactor legacy AWS solutions (e.g., Glue, Redshift) into a modern, CI/CD-deployed Lakehouse environment with proper observability and data quality controls.
Actively support the design and implementation of a medallion lakehouse architecture to support ML and analytics use cases, incorporating data mesh principles.
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field.
2 years of hands-on experience in building modern data engineering solutions (experience in e-commerce is a plus).
Good proficiency in PySpark and a solid understanding of the Spark processing engine architecture (required).
Proven experience with Python for building applications, automated testing, and deployment.
Advanced SQL skills for working with structured and semi-structured data.
Familiarity with data lake architecture and data modeling concepts.
Proficiency in CI/CD tools and Git-based development workflows.
Fluency in English (our team works in a fully international environment).
Team player mindset and eagerness to learn and adopt new technologies and frameworks.
Experience with AWS ecosystem and the Databricks platform is highly welcomed
Hands-on experience with AWS Glue, Redshift, or other AWS-native data services
Exciting challenges and a huge influence on our project and business: you'll be responsible for bringing our new projects live.
A steep learning curve in a dynamic company with an international orientation.
Flexible working hours and a choice of contract type, including B2B.
The possibility to develop your skills through training and cooperation with international experts.
Private health care
English and German lessons in small groups, tailored to your skills.
Remote work and flexible working hours
Possibility of partial remote work, as well as adjusting working hours to your daily schedule.
There is no shortage of coffee, fruit, pizza, sweets and healthy snacks in our office.