Design, develop, and implement ETL pipelines with continuous integration and deployment.
Collaborate with other data engineers and data scientists to understand data requirements and optimize data solution processes.
Ensure that data is cleansed, mapped, transformed, and otherwise optimized for storage and use according to business and technical requirements.
Automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.).
Load transformed data into storage and reporting structures, including data warehouses, high-speed indexes, real-time reporting systems, and analytics applications.
Build data pipelines that consolidate data from multiple sources.
Other responsibilities include extracting data, troubleshooting issues, and maintaining the data warehouse.
Implement best practices for pipeline building and governance.
Troubleshoot and resolve issues related to pipeline deployment and performance.
Ensure compliance with security and data privacy standards.
Design solutions using Microsoft Azure services and other tools.