The Data Engineer Lead will join our data engineering capability in ELC's Eastern European Center, located in Bucharest. To help ELC become a data-driven organization, the role is responsible for managing the Data Ingestion Service as well as maintaining and evolving the Data Ingestion Framework software.
Working within a virtual global team, the candidate will have the opportunity to work on analytical projects across the entire enterprise. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer Lead will support our global team on data initiatives and will ensure consistency with, and adherence to, the standards and guidelines of the platforms.
The candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our data architecture to support our next generation of products and data initiatives.
Main Responsibilities:
- Manage delivery across multiple concurrent data ingestion projects
- Manage a capacity plan for the Data Ingestion Service; liaise with the global team to address constraints and adapt resources and/or stakeholder expectations.
- Onboard and train developers in the use of the Data Ingestion Framework software
- Manage the Data Ingestion Service budget: record actuals, track actuals vs. plan, and liaise with the global team to request additional funding as needed.
- Adjust prioritization according to business needs.
- Design, build and deploy optimized, sustainable data pipelines and data products leveraging ELC best practices and development frameworks
- Maintain and evolve the data ingestion framework and collaborate with the other team members to make architectural changes
- Define and evolve best practices, standards and guidelines; assist in rolling out Continuous Integration and Delivery methodology.
- Coach data engineers on best practices and the technical concepts behind large-scale data platforms
- Create functional & technical documentation - e.g., ETL architecture documentation, unit testing plans and results, data integration specifications, etc.
Qualifications
Technical Skills and Qualifications:
Mandatory:
- Strong SQL skills
- Experience in designing and building data solutions in Azure Databricks, ensuring scalability, performance, and reliability.
- Deep understanding and hands-on experience with Azure data services such as Azure SQL Database, Azure Data Lake Storage, Azure Blob Storage, etc.
- Expertise in designing and implementing ETL processes using tools like Azure Data Factory or Azure Databricks
- Strong PySpark skills and experience coding in distributed computing environments; Databricks preferred.
Nice to have:
- Programming experience, preferably Python
- Experience with orchestration tools like Airflow, Astronomer or Cloud Composer
- Data/Analytics Project experience involving SAP data
- Data/Analytics Project experience in CPG or Retail; Beauty industry a plus.
- Project experience demonstrating strong analytical skills to understand business problems and the initiative to solve them with insights derived from data
Job: Information Technology
Primary Location: RO-B-Bucharest
Job Type: Standard
Schedule: Full-time
Shift: 1st (Day) Shift
Job Number: 2318563