We are seeking a highly skilled Senior Data Software Engineer to join our team and spearhead efforts in creating and optimizing data integration and processing pipelines that power Enterprise Data Products. This position focuses on a transformative project to transition from Azure Synapse Data Warehouse to a modern Databricks-based architecture, incorporating Dremio for semantic interfaces and implementing domain-driven data modeling.
Responsibilities
- Develop efficient, high-performance data pipelines within the Databricks Medallion architecture using Delta Live Tables (DLT)
- Integrate and implement custom Data Quality components based on provided designs
- Conduct data modeling and apply domain-driven design to enhance the enterprise's new data platform and products
- Build performant ETL pipelines to transform and process large datasets
- Create and maintain semantic models in Dremio with detailed guidance
- Optimize data systems to ensure scalability and reliability in a production environment
- Collaborate with cross-functional teams to rationalize existing Power BI reports and align them with rebuilt data products on the new platform
- Troubleshoot, debug, and enhance the functionality of data integration processes
- Ensure proper documentation of workflows, processes, and data transformations
- Actively contribute to the improvement of data engineering practices and tooling across the organization
Requirements
- 3+ years of experience in software engineering or data engineering roles
- Proficiency in Azure Databricks, including DLT and Unity Catalog
- Competency in PySpark or a combination of Spark experience with strong Python fundamentals
- Background in Azure Synapse Analytics
- Understanding of designing and building data pipelines using modern cloud architecture
- Skills in data modeling and domain-driven design principles
- Familiarity with Dremio or similar semantic layer tools
- Demonstrated experience building performant ETL pipelines for large-scale data systems
- Capability to work collaboratively within a multi-disciplinary team
- Strong problem-solving skills and an ability to deliver high-quality solutions under tight deadlines
- Excellent command of written and spoken English (B2+ level)
Nice to have
- Proficiency in Dremio, including creating and optimizing semantic interfaces
- Experience with Power BI report rationalization and aligning reports to updated data products
- Knowledge of custom Data Quality frameworks or similar data validation tools
We offer
- Improved medical coverage - EPAMers are eligible to participate in a supplementary health insurance program with industry-standard coverage, with the Company funding 100% of the monthly premium
- Lunch Allowance - You will receive a daily allowance of CLP $7,000 per working day. Enjoy a nice meal on us
- Allowance for internet and electricity - You will receive an allowance of CLP $15,000 per month to cover internet and electricity expenses
- National Holiday Bonus - We celebrate joining the Chilean market. That is why all our employees will receive a bonus of CLP $86,646 in September
- Christmas Bonus - You will receive an End of Year bonus of CLP $170,539. It will be paid during the month of December, to ensure you have a Happy Holiday!
- Learning Culture - We want you to be the best version of yourself, which is why we offer unlimited access to learning platforms, a wide range of internal courses, and all the knowledge you need to grow professionally
- Additional Income - Besides your regular salary, you will also have the chance to earn extra income by referring talent, serving as a technical interviewer, and in many more ways
- Are you open to relocation? - If you want to relocate to another country and we have the right project, we will assist you every step of the way to help you and your family reach your new home