We are looking for a Lead Data DevOps to join EPAM and contribute to a project for a large customer in the e-commerce/fashion industry.
As a Lead Data DevOps on the Data Platform team, you will focus on maintaining and implementing new features in the data transformation architecture, which is the backbone of the Customer's analytical data platform. By constantly challenging the status quo, you will deliver high-performance data processing solutions that are efficient and reliable at scale.
This position offers remote setup with the flexibility to work from any location in Kazakhstan, whether it's your home or well-equipped offices in Astana, Almaty or Karaganda.
Responsibilities
- Troubleshooting user issues related to Databricks (Spark) and suggesting optimizations for long-running or resource-intensive jobs
- Guiding users on best practices of Databricks cluster management
- Consulting with and helping our diverse teams develop, implement and maintain sustainable, high-performance, growth-ready data-processing and data-integration systems
- Implementing services that improve the CI/CD experience of users by emphasizing self-service
- Working together with other teams in the Customer's Data Infrastructure organization to deliver services that serve as the backbone of the Customer's central data lake
- Improving data access methods to provide a bulletproof, secure and compliant-by-default self-service platform
- Guiding users and supporting them to find cost-effective setups while keeping high efficiency
- Implementing observability solutions in Databricks to reduce idle (slack) resource costs
- Driving users to adopt cutting-edge Databricks features such as Photon and Graviton-based instances
- Reviewing current infrastructure to find cost-saving opportunities
- Analyzing, designing and developing solutions in an Agile team to support essential business needs for the platform search application
Requirements
- Proficiency with distributed data processing frameworks such as Apache Spark and a solid understanding of relational database management systems
- Strong hands-on production experience with Databricks
- Practical knowledge of Python and Scala, along with strong SQL skills
- Hands-on production experience with cloud technologies (AWS services such as IAM, S3, Lambda and EC2)
- Engineering craftsmanship with an expertise in software development processes focusing on testing, continuous integration/continuous delivery (CI/CD), monitoring and writing documentation
- Advanced English skills
Nice to have
- Familiarity with AWS networking fundamentals
- Hands-on experience with Terraform
- We connect like-minded people:
- Delivering innovative solutions to industry leaders, making a global impact
- Enjoyable working environment, whether it is the vibrant office or the comfort of your home
- Opportunity to work abroad for up to two months per year
- Relocation opportunities within our offices in 55+ countries
- Corporate and social events
- We invest in your growth:
- Leadership development, career advising, soft skills and well-being programs
- Certifications, including GCP, Azure and AWS
- Unlimited access to LinkedIn Learning, getAbstract, O'Reilly and A Cloud Guru
- Free English classes with certified teachers
- Discounts in local language schools, including online courses for the Kazakh language
- We cover it all:
- Participation in the Employee Stock Purchase Plan
- Monetary bonuses for engaging in the referral program
- Comprehensive medical & family care package
- Six trust days per year (sick leave without a medical certificate)
- Coverage for psychology sessions of your choice
- Benefits package (sports activities, a variety of stores and services)