We are looking for a highly skilled Senior Data Software Engineer to join our team and lead efforts in building and optimizing data integration and processing pipelines that power Enterprise Data Products. This role focuses on a transformative project to replace Azure Synapse Data Warehouse with a modern Databricks-based architecture, introducing Dremio for semantic interfaces and applying domain-driven data modeling.
Responsibilities
- Develop efficient and high-performance data pipelines within Databricks Medallion architecture using DLT
- Integrate and implement custom Data Quality components following provided designs
- Conduct data modeling and domain-driven design to support the enterprise's new data platform and products
- Build performant ETL pipelines to transform and process large datasets
- Create and maintain semantic models in Dremio in alignment with guided approaches
- Optimize data systems to ensure scalability and reliability in a production environment
- Collaborate with cross-functional teams to rationalize existing Power BI reports and repoint them to restructured data products on the new platform
- Troubleshoot, debug, and refine the functionality of data integration processes
- Ensure proper documentation of workflows, processes, and data transformations
- Actively contribute to advancing the data engineering practices and tooling across the organization
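To make the first responsibility concrete, here is a minimal sketch of the bronze/silver/gold layering behind a Medallion-architecture pipeline. Plain Python stands in for Spark/DLT, and all function names, fields, and values are hypothetical illustrations, not part of the role's actual codebase.

```python
# Minimal illustration of Medallion-style layering (bronze -> silver -> gold).
# Plain Python stands in for Spark/DLT; all names and fields are hypothetical.

def bronze_ingest(raw_records):
    """Bronze: land raw records as-is, tagging each with its source system."""
    return [{**r, "_source": "erp"} for r in raw_records]

def silver_clean(bronze_records):
    """Silver: validate and standardize (a toy data-quality check)."""
    return [
        {**r, "amount": float(r["amount"])}
        for r in bronze_records
        if r.get("amount") is not None
    ]

def gold_aggregate(silver_records):
    """Gold: business-level aggregate consumed by reports or a semantic layer."""
    totals = {}
    for r in silver_records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

raw = [
    {"region": "NA", "amount": "100.0"},
    {"region": "EU", "amount": "50.5"},
    {"region": "NA", "amount": None},  # invalid record, dropped at the silver layer
]
report = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(report)  # {'NA': 100.0, 'EU': 50.5}
```

In a real Databricks DLT pipeline each layer would be a declarative table definition with expectations enforcing data quality, but the flow of raw ingestion, cleansing, and business aggregation is the same.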
Requirements
- 3+ years of experience in software engineering or data engineering roles
- Proficiency in Azure Databricks, including DLT and Unity Catalog
- Proficiency in PySpark, or equivalent Spark experience combined with strong Python fundamentals
- Background in Azure Synapse Analytics
- Understanding of designing and building data pipelines using modern cloud architecture
- Skills in data modeling and domain-driven design principles
- Familiarity with Dremio or similar semantic layer tools
- Demonstrated experience building performant ETL pipelines for large-scale data systems
- Capability to work collaboratively within a multi-disciplinary team
- Strong problem-solving skills and ability to deliver high-quality solutions under tight deadlines
- Excellent command of written and spoken English (B2+ level)
Nice to have
- Expertise in Dremio, focusing on creating and enhancing semantic interfaces
- Flexibility to use Power BI for report rationalization and repointing data products
- Knowledge of custom Data Quality frameworks or equivalent data validation tools
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn