Added: 2025-05-28 13:26:00
Updated: 2025-05-30 03:36:25

Data Science Python Software Engineer (Remote)

Wroclaw, Lower Silesian Voivodeship, Poland

Type: n/a

Category: IT & Internet & Media

Requirements: English
Company: Revolut
Region: Wroclaw, Lower Silesian Voivodeship

More visibility, more control, and more freedom. So far, we have 10,000+ people working around the world, from our offices and remotely, to help us achieve our mission. Our Technology team isn't just one of the best in the industry. It's one of the best in the world. And we're proud of it. It's our driving force, our engine. From building a new financial backend to creating an innovative app, there's nothing they can't do. Our Technology team isn't here to fix legacy systems; it's here to build world-class financial features from the ground up that'll be used by millions of people around the world. Data platform engineers are the enablers of this data-centric culture, providing the infrastructure and tools that power insight generation and decision-making for everyone in the company, from entry-level analysts to C-level executives. We're looking for a talented and passionate engineer who's an exceptional builder and reliable collaborator to manage our exponential growth in data and related complexity.

What you'll be doing:
- Designing, building, and maintaining an efficient and reliable data platform, streamlining end-to-end processes and automating workflows
- Partnering with cross-functional teams (Product, Engineering, Analytics) to build and enhance a seamless data platform, translating abstract concepts into practical solutions
- Planning and executing organisation-wide platform changes, ensuring consistent best practices for coding, testing, deployment, and maintenance
- Leveraging data to guide all aspects of engineering work, ensuring insight-driven outcomes

What you'll need:
- A bachelor's or master's degree in computer science or a related field, or equivalent practical experience
- Proficiency in Python, SQL, and Unix shell scripting
- Demonstrated experience in custom ETL design, implementation, and maintenance, along with workflow orchestration using tools like Airflow
- Hands-on experience with query engines and data warehouses such as Trino, Spark, Snowflake, and BigQuery
- Experience building data platforms using Spark, Trino, Presto, Flink, or similar, with a focus on data quality, SQL performance tuning, and data warehousing principles
- Expertise in cloud platforms (GCP, AWS), containerisation, and infrastructure-as-code (Docker, Kubernetes, Terraform)
- Familiarity with notebook-based data science workflows and proficiency with monitoring and logging tools (NewRelic, Grafana, Prometheus, ELK)

We're not just doing this because it's the right thing to do. We're doing it because we know that seeking out diverse talent and creating an inclusive workplace is the way to create exceptional, innovative products and services for our customers.