We are seeking an experienced Senior Data Software Engineer to join our team and take the lead in developing and refining data integration and processing pipelines that drive Enterprise Data Products. This position focuses on a transformative initiative to transition from Azure Synapse Data Warehouse to a modern Databricks-based architecture, integrate Dremio for semantic interfaces, and adopt domain-driven data modeling practices.
Responsibilities
- Design efficient, high-performance data pipelines within the Databricks Medallion architecture using Delta Live Tables (DLT)
- Implement and incorporate custom Data Quality components aligned with provided designs
- Apply data modeling and domain-driven design principles to support the new enterprise data platform and products
- Build high-performance ETL pipelines to transform and process large-scale datasets
- Develop and manage semantic models in Dremio, adhering to detailed instructions
- Enhance data systems to ensure scalability and reliability in production environments
- Collaborate with cross-functional teams to streamline existing Power BI reports and integrate them into rebuilt data products on the platform
- Diagnose, debug, and improve the functionality of data integration processes
- Document workflows, processes, and data transformations comprehensively
- Contribute actively to the advancement of data engineering practices and tools across the organization
Requirements
- 3+ years of software engineering or data engineering experience
- Proficiency in Azure Databricks, including DLT and Unity Catalog
- Competency in PySpark or Spark alongside strong Python fundamentals
- Background in Azure Synapse Analytics
- Understanding of building and designing data pipelines with modern cloud architecture
- Skills in data modeling and domain-driven design concepts
- Familiarity with Dremio or comparable semantic layer tools
- Demonstrated experience building performant ETL pipelines for large-scale data platforms
- Capability to collaborate effectively within multidisciplinary teams
- Strong analytical skills with the ability to produce high-quality solutions under tight deadlines
- Excellent English communication skills (B2+ level)
Nice to have
- Expertise in Dremio for developing and optimizing semantic interfaces
- Flexibility to collaborate on Power BI report rationalization and integration with data products
- Knowledge of custom Data Quality frameworks or comparable validation tools
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn