Added: 2025-05-20 13:47:00
Updated: 2025-05-25 03:41:05

Data Engineer with Python @ GFT Poland

Łódź, Łódź Voivodeship, Poland

Type: n/a

Category: IT & Internet & Media

Requirements: English
Company: GFT Poland
Region: Łódź, Łódź Voivodeship

Why join GFT?

You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.

Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications and our internal community of experts. We have strong partnerships with top cloud providers: Google, Amazon and Microsoft - we are number one in Poland in the number of GCP certificates. Apart from GCP, you can also develop in AWS or Azure.

We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing or artificial intelligence.

You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.

We offer you:


Nice to have Skills:

We are looking for professionals at various levels, from Mid through Senior to Expert, to join our team. Your responsibilities will include performance tuning and optimization of existing solutions, building and maintaining ETL pipelines, and testing and documenting current data flows. You will also help implement tools and processes that support data-related projects and promote the best development standards across the team.

Requirements: Python, SQL, PySpark, Hadoop, Hive, Data engineering, GCP, Data management, Git, GitHub, Jenkins, Jira, PostgreSQL, BigQuery, Apache Beam, Spark, GitHub Actions, Control-M, Airflow, Scala, Bash, Continuous integration

Additionally: Home office, Knowledge sharing, Life insurance, Sport subscription, Training budget, Private healthcare, International projects, Integration events, Free coffee, Playroom, Free snacks, Free beverages, In-house trainings, In-house hack days, Modern office, Free fruits.