Requirements: English
Company: Motife Sp. z o.o.
Region: Warsaw, Masovian Voivodeship
technologies-expected :
- PySpark
- HIVE
- Hadoop
- PL/SQL
- AWS
- Snowflake
- Apache Spark
about-project :
- Our client is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. The company has 90,000+ employees across the globe.
- We are looking for a Data Engineer to design and optimize data solutions that power decision-making across global business units. You'll work hands-on with technologies like PySpark, Hadoop, and Hive SQL to process large-scale datasets, while also playing a key role in ensuring system performance, data quality, and production stability. This role blends technical implementation with cross-regional collaboration, operational support, and continuous improvement of data infrastructure.
- Stack: Apache Spark, Hadoop Ecosystem, Python, SparkSQL
responsibilities :
- Implement and configure PySpark, Hadoop, and Hive SQL solutions in production environments, working with large-scale datasets.
- Engage with stakeholders across EMEA, NAM, and APAC regions to address incidents, coordinate fixes, and ensure the timely resolution of production issues.
- Collaborate with BAU teams and the global production assurance team to maintain system stability, performance, and adherence to SLAs.
- Provide technical guidance and support to offshore teams, particularly in PySpark and Hadoop environments, including troubleshooting and issue resolution.
- Utilize Autosys for job scheduling, monitoring, and automation of workflows.
- Work closely with regional EMEA tech teams to ensure compliance with data protection regulations and best practices in data handling.
requirements-expected :
- Professional experience in Big Data: PySpark, Hive, Hadoop, and PL/SQL.
- Good knowledge of AWS and Snowflake.
- Good understanding of CI/CD and system design.
- The Databricks Certified Developer Apache Spark 3.0 certification is mandatory for this position.
- Excellent written and oral communication skills in English.
- Ability to understand and work on various internal systems.
- Ability to work with multiple stakeholders.
- Experience with fund-transfer technologies; AML knowledge will be an added advantage.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Nice to have: experience with Starburst Presto.