We are seeking an experienced Senior Data Platform Engineer with deep expertise in Databricks and multi-cloud environments to join EPAM.
In this role, you'll engineer scalable, high-performance data solutions that unlock the full potential of data for our Fortune 1000 clients. If you thrive on building data platforms from scratch and driving innovation across cloud platforms, this opportunity at EPAM could be perfect for you.
Responsibilities
- Architect and deploy robust, scalable, and secure data platforms using Databricks, ensuring optimal performance
- Create cloud-agnostic solutions across AWS, Azure, and GCP to ensure flexibility and system resilience
- Design and implement end-to-end data pipelines integrating data lakes, warehouses, and streaming frameworks
- Leverage Databricks SQL, Delta Lake, MLflow, and advanced Spark optimization techniques to improve data access and processing performance
- Collaborate with cross-functional teams to implement and maintain Databricks-based workflows following best practices
- Develop and maintain CI/CD pipelines tailored to data platform deployment and testing
- Set up and manage monitoring, logging, and alerting frameworks to ensure infrastructure health and operational excellence
- Optimize compute and storage resources to achieve cost-efficiency without compromising on performance
- Troubleshoot and resolve issues related to Databricks and Spark performance
- Mentor team members on cluster management, job optimization, and resource allocation within Databricks environments
- Ensure adherence to compliance standards and maintain platform security
- Drive adoption of advanced Databricks capabilities such as the Photon engine and Graviton-based instances for improved efficiency
- Regularly update and refine existing architectures to meet changing business and technology needs
Requirements
- Extensive experience with Databricks, Apache Spark, and distributed data processing systems
- Strong programming skills in Python, Scala, and SQL
- Proficiency in AWS (specifically S3, IAM, Lambda), Azure, or GCP, with a focus on data engineering services
- Expertise in data architecture principles, including data lakes, lakehouses, and ETL workflows
- Hands-on experience with CI/CD tools and infrastructure as code practices (Terraform, CloudFormation preferred)
- Familiarity with monitoring and observability frameworks suitable for large-scale data environments
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities
- Ability to self-manage and operate effectively in a dynamic environment
- Certifications in Databricks, AWS, Azure, or GCP
- Knowledge of Kubernetes and containerized deployments for data pipelines
- Experience with real-time data streaming frameworks and governance tools
We offer
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses
Why Join EPAM
- WORK AND LIFE BALANCE. Enjoy more of your personal time with flexible work options, 24 working days of annual leave and paid time off for numerous public holidays.
- CONTINUOUS LEARNING CULTURE. Craft your personal Career Development Plan to align with your learning objectives. Take advantage of internal training, mentorship, sponsored certifications and LinkedIn courses.
- CLEAR AND DIFFERENT CAREER PATHS. Grow in an engineering or managerial direction to become a People Manager, in-depth technical specialist, Solution Architect, or Project/Delivery Manager.
- STRONG PROFESSIONAL COMMUNITY. Join a global EPAM community of highly skilled experts and connect with them to solve challenges, exchange ideas, share expertise and make friends.