We are looking for a Senior Data Engineer to make our team even stronger.
Responsibilities
- Develop, monitor, and operate our most used and most critical curated data pipeline, Sales Order Data (including post-order information such as shipments, returns, and payments). This pipeline processes hundreds of millions of records to provide high-quality datasets for analytical and machine learning use cases (a minimal illustrative sketch of such a pipeline step follows this list)
- Consult with analysts, data scientists, and product managers to build and continuously improve "Single Source of Truth" KPIs for business steering, such as the central Profit Contribution measurement (PC II)
- Leverage and improve a cloud-based tech stack that includes AWS (Azure), Databricks, Kubernetes, Spark, Airflow, Python, and Scala
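For illustration only, here is a minimal Scala sketch of the kind of curated batch step such a pipeline might contain on Spark with Delta Lake: deduplicate raw order events and upsert them into a curated table. All table, column, and object names (raw_sales_orders, curated.sales_orders, order_id, updated_at, CuratedSalesOrdersJob) are hypothetical and not taken from the actual pipeline.

```scala
// Hypothetical example: dedupe raw sales-order events and upsert them into a
// curated Delta table. Table and column names are illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._
import io.delta.tables.DeltaTable

object CuratedSalesOrdersJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("curated-sales-orders")
      .getOrCreate()

    // Read the latest batch of raw order events (hypothetical source table).
    val rawOrders = spark.read.table("raw_sales_orders")

    // Keep only the newest record per order_id, ordered by updated_at.
    val latestPerOrder = rawOrders
      .withColumn("rn", row_number().over(
        Window.partitionBy("order_id").orderBy(col("updated_at").desc)))
      .filter(col("rn") === 1)
      .drop("rn")

    // Upsert into the curated Delta table so downstream analytical and ML
    // consumers always see exactly one clean row per order.
    DeltaTable.forName(spark, "curated.sales_orders")
      .as("target")
      .merge(latestPerOrder.as("source"), "target.order_id = source.order_id")
      .whenMatched().updateAll()
      .whenNotMatched().insertAll()
      .execute()

    spark.stop()
  }
}
```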
Requirements
- Expertise in Apache Spark, including Spark Streaming and Spark SQL (see the streaming sketch after this list)
- Good hands-on experience with Databricks and Delta Lake
- Fluency in the Scala programming language
- Good understanding of and hands-on experience with CI/CD
- Solid working experience with GitHub
- Fluency with at least one cloud landscape (AWS, Azure, or GCP)
- Ability to build Apache Airflow pipelines
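As a rough illustration of the Spark Streaming and Spark SQL expertise listed above, here is a minimal Scala sketch that incrementally reads order events from a Delta table and aggregates them with Spark SQL. The table names, column names, and checkpoint path are hypothetical.

```scala
// Hypothetical example: stream order events out of a Delta table and maintain
// an hourly count per event type with Spark SQL. Names and paths are illustrative.
import org.apache.spark.sql.SparkSession

object OrderEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("order-events-stream")
      .getOrCreate()

    // Incrementally read rows appended to a Delta table (hypothetical name).
    val events = spark.readStream.table("raw_order_events")
    events.createOrReplaceTempView("order_events")

    // Spark SQL over the streaming view: events per type per hour.
    val hourlyCounts = spark.sql(
      """SELECT window(event_time, '1 hour') AS hour,
        |       event_type,
        |       count(*) AS events
        |FROM order_events
        |GROUP BY window(event_time, '1 hour'), event_type""".stripMargin)

    // Continuously rewrite the aggregate into a curated Delta table.
    val query = hourlyCounts.writeStream
      .format("delta")
      .outputMode("complete")
      .option("checkpointLocation", "/tmp/checkpoints/order_event_counts")
      .toTable("curated.order_event_counts")

    query.awaitTermination()
  }
}
```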
We offer
- Opportunity to work in a fast-paced, agile software engineering culture
- Benefit program (5 weeks of vacation, 5 paid sick days, meal vouchers, cafeteria and recreation bonuses, reimbursement for glasses, contribution to a pension fund)
- Referral bonuses for recommended candidates
- English language courses
- Great learning and development opportunities, including in-house professional training, career advisory and coaching, sponsored professional certifications, well-being programs, LinkedIn Learning Solutions and much more