As one of the world's leading digital transformation service providers, we are actively expanding our Data Practice across the UK to meet increasing client demand. We are hiring a Lead Data Engineer with a specific focus on Capital Markets Risk.
The ideal candidate will have extensive experience designing, developing and optimizing data pipelines and infrastructure within the capital markets risk domain. This role requires expertise in risk analytics, data engineering best practices and cloud-based solutions.
Responsibilities
- Design, develop and optimize data pipelines and infrastructure for risk analytics in capital markets
- Implement and manage workflows, DAGs and tasks in Apache Airflow, ensuring adherence to best practices (see the illustrative sketch after this list)
- Deploy and manage cloud database services, with a preference for Snowflake
- Utilize SQL and Python to manipulate and analyze data effectively
- Develop and maintain risk technology solutions, including VaR, Stress Testing, Sensitivities and P&L Vectors
- Implement cloud-based solutions using modern cloud technologies, preferably AWS
- Ensure high data quality standards through validation and governance processes
- Collaborate with cross-functional teams to deliver robust data solutions
- Utilize modern SDLC tooling such as Git, Bamboo or Jira
- Troubleshoot and resolve complex data-related issues efficiently
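To illustrate the Airflow and risk-pipeline responsibilities above, here is a minimal sketch of a daily risk DAG, assuming Airflow 2.x with the Snowflake provider installed. The DAG id, connection id, stored procedure and compute_var helper are hypothetical placeholders for illustration, not part of any existing client codebase.

```python
# Minimal sketch of a daily capital-markets-risk pipeline in Airflow 2.x.
# The DAG id, connection id, stored procedure and compute_var helper are
# hypothetical placeholders, not an existing codebase.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator


def compute_var(**context):
    # Placeholder: read the P&L vectors loaded upstream and compute a
    # 1-day 99% historical VaR per portfolio.
    ...


with DAG(
    dag_id="capital_markets_risk_daily",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * 1-5",          # weekday mornings, after EOD feeds land
    catchup=False,
) as dag:
    load_positions = SnowflakeOperator(
        task_id="load_eod_positions",
        snowflake_conn_id="snowflake_risk",   # assumed Airflow connection
        sql="CALL risk.load_eod_positions(CURRENT_DATE);",  # assumed stored procedure
    )

    calculate_var = PythonOperator(
        task_id="calculate_var",
        python_callable=compute_var,
    )

    load_positions >> calculate_var
```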
Requirements
- Experience in risk analytics, including market risk or credit risk (a worked VaR example follows this list)
- Strong software development skills in Python or another common programming language (e.g., Java or Scala)
- Deep expertise in Apache Airflow, with demonstrated experience designing and managing workflows
- Proficiency in deploying and managing cloud database services, particularly Snowflake
- Advanced skills in SQL and Python for data manipulation and analysis
- Hands-on experience implementing cloud-based solutions with AWS or similar cloud platforms
- Familiarity with modern SDLC tooling such as Git, Bamboo or Jira
- Proven ability to implement and enforce data quality standards and best practices
- Excellent communication skills with the ability to collaborate effectively within a team environment
- Strong understanding of derivatives, pricing and risk management for structured products, options and exotic derivatives
- Knowledge of additional data processing libraries and tools to enhance data engineering workflows
- Expertise in real-time data processing frameworks such as Apache Flink or Kafka Streams
- Experience building event-driven and/or streaming data services
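As a small worked example of the risk analytics and Python skills listed above, the sketch below computes a 1-day 99% historical-simulation VaR from a P&L vector using NumPy. The portfolio figures are made up for illustration.

```python
# Illustrative only: 1-day historical-simulation VaR from a P&L vector.
# The portfolio figures below are made up for the example.
import numpy as np


def historical_var(pnl_vector: np.ndarray, confidence: float = 0.99) -> float:
    """Return VaR as a positive loss at the given confidence level."""
    # VaR is the loss at the (1 - confidence) quantile of the P&L distribution.
    return -float(np.quantile(pnl_vector, 1.0 - confidence))


# Hypothetical P&L vector: 500 simulated daily P&L scenarios for one portfolio (GBP).
rng = np.random.default_rng(seed=42)
pnl = rng.normal(loc=0.0, scale=250_000.0, size=500)

print(f"1-day 99% VaR: £{historical_var(pnl):,.0f}")
```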
Benefits
- EPAM Employee Stock Purchase Plan (ESPP)
- Protection benefits including life assurance, income protection and critical illness cover
- Private medical insurance and dental care
- Employee Assistance Program
- Competitive group pension plan
- Cyclescheme, Techscheme and season ticket loans
- Various perks such as free Wednesday lunch in-office, on-site massages and regular social events
- Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
- If otherwise eligible, participation in the discretionary annual bonus program
- If otherwise eligible and hired into a qualifying level, participation in the discretionary Long-Term Incentive (LTI) Program
*All benefits and perks are subject to certain eligibility requirements