EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are seeking an experienced Lead Snowflake Data Engineer to spearhead our data warehousing solutions, with a focus on creating scalable and secure architectures using Snowflake.
The ideal candidate will drive development, optimization, and governance of our Snowflake environment to support intricate data ingestion, storage, and analytics requirements.
Responsibilities
- Lead the design and implementation of scalable Snowflake data models to meet data ingestion and analytics needs
- Develop and maintain robust ETL pipelines, ensuring data consistency and integrity across multiple sources
- Optimize Snowflake configurations for performance enhancements, query tuning, and efficient data partition management
- Implement data security measures, role-based access control, and data masking within Snowflake to adhere to compliance and governance standards
- Automate data processing and pipeline monitoring using dbt and Apache Airflow
- Collaborate with cross-functional teams to solve complex data challenges and effectively troubleshoot Snowflake-related issues
- Produce thorough documentation and reports on data structures, ETL workflows, and system processes for team transparency and knowledge sharing
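To give a flavor of the data-consistency work described above, here is a minimal, hypothetical sketch in Python (the core language for this role) of reconciling a source extract against a Snowflake target by checksum; the function and field names are illustrative, not part of any EPAM codebase:

```python
import hashlib

def row_checksum(row):
    # Stable fingerprint for one record, used to compare source vs. target.
    return hashlib.md5("|".join(str(v) for v in row).encode()).hexdigest()

def reconcile(source_rows, target_rows):
    # Report rows missing from the target and rows the source never produced.
    src = {row_checksum(r) for r in source_rows}
    tgt = {row_checksum(r) for r in target_rows}
    return {
        "missing_in_target": src - tgt,
        "unexpected_in_target": tgt - src,
    }

source = [(1, "alice"), (2, "bob")]
target = [(1, "alice")]
report = reconcile(source, target)
print(len(report["missing_in_target"]))  # one row did not land in the target
```

In practice a check like this would run as a task inside an Airflow DAG or a dbt test rather than as a standalone script.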
Requirements
- 8 to 12 years of experience in data engineering
- Proficiency in Python programming
- Background in Snowflake modeling, including roles, schemas, and databases
- Expertise in data modeling using Data Vault methodology
- Competency in designing and developing data transformation pipelines with the dbt framework
- Skills in workflow management using tools such as Argo, Oozie, or Apache Airflow
- Familiarity with AWS as a cloud service provider
- Understanding of how to build scalable, secure, and highly performant warehousing solutions that integrate with monitoring and observability features
We offer
- Opportunity to work on technical challenges with impact across geographies
- Vast opportunities for self-development: an online university, global knowledge sharing, and support for external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn learning solutions
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package:
  - Health benefits
  - Retirement benefits
  - Paid time off
  - Flexible benefits
- Forums to explore beyond work passion (CSR, photography, painting, sports, etc.)