EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are seeking a Snowflake Data Engineer to strengthen our data solutions team.
The ideal candidate will design and maintain effective data structures, optimize data storage and retrieval in Snowflake, and ensure data integrity across various sources. This role involves working with cross-functional teams to deliver high-quality data solutions for analytical and operational needs.
Responsibilities
- Designing and implementing scalable Snowflake data models optimized for ingestion and analytics
- Building and maintaining robust ETL pipelines integrating data from multiple sources into Snowflake, ensuring data consistency
- Tuning Snowflake usage and storage, optimizing query performance, and managing data partitions for fast, reliable data access
- Enforcing best practices for data security, role-based access, and data masking in Snowflake to comply with governance standards
- Utilizing tools such as dbt and Apache Airflow for scheduling data processes and automating pipeline control
- Collaborating with data scientists, business analysts, and stakeholders to solve complex data challenges and troubleshoot issues related to Snowflake
- Creating comprehensive documentation on data structures, ETL workflows, and system processes for team transparency and knowledge sharing
Requirements
- 3 to 5 years of experience in data engineering or a similar field
- Proficiency in Python
- Background in AWS as the primary cloud platform
- Expertise in Snowflake modeling and Data Vault data modeling, as well as the dbt framework for data transformation pipelines
- Familiarity with workflow management tools such as Argo, Oozie, and Apache Airflow
- Understanding of how to develop a scalable, secure, high-performance data warehouse that integrates with monitoring and observability tools
We Offer
- Opportunity to work on technical challenges with impact across geographies
- Vast opportunities for self-development: online university, global knowledge sharing, and learning through external certifications
- Opportunity to share your ideas on international platforms
- Sponsored Tech Talks & Hackathons
- Unlimited access to LinkedIn learning solutions
- Possibility to relocate to any EPAM office for short- and long-term projects
- Focused individual development
- Benefit package:
  - Health benefits
  - Retirement benefits
  - Paid time off
  - Flexible benefits
- Forums to explore passions beyond work (CSR, photography, painting, sports, etc.)