We are seeking a highly skilled Senior Data Software Engineer to join our team and spearhead the creation and optimization of data integration and processing pipelines that power Enterprise Data Products. This position focuses on a transformative project: migrating from an Azure Synapse Data Warehouse to a modern Databricks-based architecture, incorporating Dremio for semantic interfaces and applying domain-driven data modeling.
Responsibilities
- Develop efficient, high-performance data pipelines within the Databricks medallion architecture using Delta Live Tables (DLT)
- Integrate and implement custom Data Quality components based on provided designs
- Conduct data modeling and apply domain-driven design to enhance the enterprise's new data platform and products
- Build performant ETL pipelines to transform and process large datasets
- Create and maintain semantic models in Dremio with detailed guidance
- Optimize data systems to ensure scalability and reliability in a production environment
- Collaborate with cross-functional teams to rationalize existing Power BI reports and align them with rebuilt data products on the new platform
- Troubleshoot, debug, and enhance the functionality of data integration processes
- Ensure proper documentation of workflows, processes, and data transformations
- Actively contribute to the improvement of data engineering practices and tooling across the organization
Requirements
- 3+ years of experience in software engineering or data engineering roles
- Proficiency in Azure Databricks, including DLT and Unity Catalog
- Competency in PySpark or a combination of Spark experience with strong Python fundamentals
- Background in Azure Synapse Analytics
- Understanding of designing and building data pipelines using modern cloud architecture
- Skills in data modeling and domain-driven design principles
- Familiarity with Dremio or similar semantic layer tools
- Demonstrated experience building performant ETL pipelines for large-scale data systems
- Capability to work collaboratively within a multi-disciplinary team
- Strong problem-solving skills and an ability to deliver high-quality solutions under tight deadlines
- Excellent command of written and spoken English (B2+ level)
Nice to have
- Proficiency in Dremio, including creating and optimizing semantic interfaces
- Experience with Power BI report rationalization and aligning reports to updated data products
- Knowledge of custom Data Quality frameworks or similar data validation tools
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn