About the role:
The Enterprise Document Delivery team is seeking a Sr ETL Software Engineer to participate in the entire system development lifecycle of applications related to document generation, processing, printing, and electronic and postal delivery for the bank. The team supports high-volume statement applications residing on different platforms. The position will work with various Lines of Business across the organization to understand their requirements and architecture and to design and develop the required solutions. The candidate should have strong knowledge of the SDLC process as well as Agile experience, and will ensure that all phases of our development are 100% Wells Fargo Technology SDLC compliant. This position requires someone who is flexible, wants to be part of a dynamic team, and can manage multiple priorities and tasks simultaneously.
Responsibilities of the role include the following:
- ETL Design and Development: Develop, maintain, and optimize ETL processes for data ingestion, transformation, and data warehousing across multiple platforms, including both SQL and NoSQL databases.
- Data Pipelines: Design, build, and manage scalable data pipelines using technologies like Databricks, Apache Spark, Python, SQL, and NoSQL databases.
- NoSQL/MongoDB Expertise: Work with MongoDB to design efficient document schemas, implement query optimization, and handle large-scale unstructured data.
- Data Integration: Collaborate with cross-functional teams to ensure seamless data integration between different sources such as databases (both relational and NoSQL), APIs, and external files.
- Performance Optimization: Implement and monitor performance metrics, optimize data processing performance, and manage ETL job scheduling and dependencies.
- Data Quality: Ensure data quality and integrity across ETL pipelines, implementing processes for data validation, cleansing, and enrichment.
- Automation: Automate repeatable ETL tasks and data processing workflows to improve efficiency and accuracy.
- Collaboration: Work closely with data architects, analysts, and business stakeholders to gather and understand data requirements.
- Cloud Platforms: Leverage cloud services (Azure, GCP) for data storage, processing, and infrastructure management, ensuring scalability and cost efficiency.
- Best Practices: Maintain documentation and adhere to data governance and best practices in data management, including security and compliance.
- Microservices: Build microservice APIs to expose ETL services.
ESSENTIAL QUALIFICATIONS
- Experience: 5+ years of experience as a Data Engineer or ETL Developer in complex, large-scale data environments.
- SSIS and Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including using Apache Spark for data processing and optimization.
- ETL Tools: Proficiency with ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
- Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Hadoop, Kafka, etc.
- NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization.
- Programming Skills: Proficiency in SQL, Python or Java, and PowerShell for building data pipelines.
- SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
- Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based models (NoSQL).
- Data Warehousing: Solid understanding of data warehousing as a core part of the ETL process; the warehouse stores data from multiple sources in an organized manner and underpins repeatable ETL workflows that support many different data sources.
- Refactoring: Experience redesigning and refactoring legacy custom ETL processes into reusable ETL workflows that ingest diverse data sources and normalize data to standard JSON/XML output.
- Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
- Version Control: Experience with Git or other version control systems for code management.
- Certification: Databricks certification, MongoDB certification, or other relevant certifications in data engineering, cloud platforms, or big data technologies.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
- Solid understanding of legacy communication protocols and migration strategies.
- Experience with cloud platforms such as AWS, Azure, Google Cloud, or TKGI.
Project Details:
- Teams use SSIS to transform mainframe-formatted files into standard JSON and XML file formats
- Converting applications from a hosted platform to a distributed, cloud-hosted environment
- Evaluating and reengineering custom ETL workflows into reusable ETL microservice APIs
- Target migration from SSIS to Databricks
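The mainframe-to-JSON conversion described above can be sketched in plain Python. The fixed-width record layout below is hypothetical (in the spirit of a COBOL copybook), not an actual mainframe layout; real layouts would come from the source systems:

```python
import json

# Hypothetical layout: (field name, start offset, width). Illustrative only.
LAYOUT = [("account_id", 0, 10), ("stmt_date", 10, 8), ("balance", 18, 12)]

def parse_fixed_width(line: str) -> dict:
    # Slice each field out of the fixed-width record and strip padding
    rec = {name: line[start:start + width].strip()
           for name, start, width in LAYOUT}
    rec["balance"] = float(rec["balance"])  # numeric normalization
    return rec

def to_json(lines: list[str]) -> str:
    # Normalize every record in the file to standard JSON output
    return json.dumps([parse_fixed_width(line) for line in lines], indent=2)

if __name__ == "__main__":
    record = "ACCT000001" + "20240131" + "     1234.56"
    print(to_json([record]))
```

On Databricks, the equivalent logic would typically be expressed as Spark substring/cast column expressions so the parse runs in parallel across a large file, rather than record by record in the driver.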
Role Purpose
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.
Do
- Provide adequate support in architecture planning, migration, and installation for new projects in own tower (platform/database/middleware/backup)
- Lead the structural/architectural design of a platform, middleware, database, backup, etc. according to various system requirements to ensure a highly scalable and extensible solution
- Conduct technology capacity planning by reviewing the current and future requirements
- Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/ platforms, as applicable
- Strategize and implement disaster recovery plans; create and implement backup and recovery plans
- Manage the day-to-day operations of the tower, troubleshooting issues, conducting root cause analysis (RCA), and developing fixes to prevent similar issues from recurring
- Plan for and manage upgrades, migrations, maintenance, backups, installation, and configuration for own tower
- Review the technical performance of own tower and identify ways to improve efficiency, fine-tune performance, and reduce performance issues
- Develop the team's shift roster to ensure uninterrupted coverage of the tower
- Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance etc.
- Provide weekly status reports to the client leadership team and internal stakeholders on database activities, covering progress, updates, status, and next steps
- Leverage technology to develop Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness
- Team Management
- Resourcing
- Forecast talent requirements as per the current and future business needs
- Hire adequate and right resources for the team
- Train direct reports to make the right recruitment and selection decisions
- Talent Management
- Ensure 100% compliance to Wipro's standards of adequate onboarding and training for team members to enhance capability & effectiveness
- Build an internal talent pool of high-potential employees (HiPos) and ensure their career progression within the organization
- Promote diversity in leadership positions
- Performance Management
- Set goals for direct reports, conduct timely performance reviews and appraisals, and give constructive feedback
- Ensure that organizational programs like Performance Nxt are well understood and that the team takes advantage of the opportunities such programs present, both for themselves and for the levels below them
- Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team
- Proactively challenge the team with larger and enriching projects/ initiatives for the organization or team
- Recognize and appreciate employees' contributions
Stakeholder Interaction
Internal
- Technology Solutions Group, BU Teams, different infrastructure teams: understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
- IRMC, QA: guidance on risk mitigation and quality standards
External
- Clients: understanding requirements, planning and status updates, maintenance and backup, issue resolution, etc.
- Vendors/Manufacturers: development and deployment of platforms, applications, databases, etc.
Display
Lists the competencies required to perform this role effectively:
- Functional Competencies/ Skill
- Technical Knowledge - Knowledge of own tower (platform, application, database, etc.) - Expert
- Domain Knowledge - Understanding of IT industry and its trends - Competent to Expert
Competency Levels
Foundation
Knowledgeable about the competency requirements. Demonstrates the competency in part, frequently and with minimal support and guidance.
Competent
Consistently demonstrates the full range of the competency without guidance. Extends the competency to difficult and unknown situations as well.
Expert
Applies the competency in all situations and serves as a guide to others as well.
Master
Coaches others and builds organizational capability in the competency area. Serves as a key resource for that competency and is recognised within the entire organization.
- Behavioral Competencies
- Managing Complexity
- Client centricity
- Execution Excellence
- Passion for Results
- Team Management
- Stakeholder Management
Deliver
1. Operations of the tower - measured by: SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2. New projects - measured by: timely delivery; no unauthorised changes; no formal escalations
If you encounter any suspicious mail, advertisements, or persons who offer jobs at Wipro, please email us at helpdesk.recruitment@wipro.com. Do not email your resume to this ID as it is not monitored for resumes and career applications.
Any complaints or concerns regarding unethical/unfair hiring practices should be directed to our Ombuds Group at ombuds.person@wipro.com.
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, caste, creed, religion, gender, marital status, age, ethnic and national origin, gender identity, gender expression, sexual orientation, political orientation, disability status, protected veteran status, or any other characteristic protected by law.
Wipro is committed to creating an accessible, supportive, and inclusive workplace. Reasonable accommodation will be provided to all applicants including persons with disabilities, throughout the recruitment and selection process. Accommodations must be communicated in advance of the application, where possible, and will be reviewed on an individual basis. Wipro provides equal opportunities to all and values diversity.