Job Description:
About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day.
One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being.
Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization.
Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!
Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations.
Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation.
In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.
Process Overview
As a Hadoop Platform Engineer on the Data Analytics Platform team, the individual will be responsible for understanding designs, proposing high-level and detailed design solutions, and proposing out-of-the-box technical solutions to business problems and to technical problems that arise in real time in production. The individual should be flexible to work weekends at least once a month (compensatory time off may be given on weekdays) and should be open to working rotational shifts. As an individual contributor in BAU, the person should have strong analytical skills to make quick decisions under pressure. The role involves engaging in discussions with the information architecture team to arrive at design solutions, proposing new technology adoption ideas, attending project meetings, partnering with near-shore and offshore teammates in a matrix environment, and coordinating with other support teams such as L2, development, testing, and upstream and downstream partners.
Job Description
The Data Analytics Platform Client Services team is seeking a candidate who is proficient in the Cloudera Hadoop ecosystem and its components (HDFS, YARN, Hive, Tez, Impala, Spark, MapReduce, HBase), Ozone, and Private Cloud Data Services components (CDW and CML). The Data Analytics Platform Client Services team member will be responsible for providing technical and administrative support for the Hadoop, Ozone, Linux, Cloud, and HBase platforms in a fast-paced operations environment supporting business-critical applications built on Hadoop and Cloud components. The analyst should have strong problem-solving skills and advanced troubleshooting ability for challenging and complex Hadoop, Cloud, and Linux issues raised by Hadoop clients, along with good knowledge of and experience in Unix and Python scripting to develop platform monitoring, application management, CM API and REST API capabilities, and capacity management tools.
The Client Services role is a tenant-facing role focused on providing efficient support.
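In practice, the platform monitoring and CM API work described above often takes the form of small Python utilities that poll the Cloudera Manager REST API for cluster and service health. The following is a minimal sketch of that pattern only; the host, port, credentials, and API version shown are placeholder assumptions, and the exact endpoint paths and response fields should be confirmed against the Cloudera Manager release deployed in the environment.

    import requests

    # Assumptions: CM host/port, credentials, and API version are placeholders;
    # verify endpoint paths against the Cloudera Manager API docs for your release.
    CM_BASE = "https://cm-host.example.com:7183/api/v41"
    AUTH = ("monitor_user", "monitor_password")  # hypothetical read-only account

    def service_health(cluster_name: str) -> dict:
        """Return {service_name: healthSummary} for one cluster via the CM REST API."""
        url = f"{CM_BASE}/clusters/{cluster_name}/services"
        resp = requests.get(url, auth=AUTH, timeout=30)
        resp.raise_for_status()
        return {svc["name"]: svc.get("healthSummary", "UNKNOWN")
                for svc in resp.json().get("items", [])}

    if __name__ == "__main__":
        # Flag anything not reporting GOOD so an alert can be raised downstream.
        for name, health in service_health("analytics-cluster").items():
            if health != "GOOD":
                print(f"ALERT: {name} health is {health}")

A check like this would typically feed an existing alerting pipeline rather than print to stdout; it is shown here only to illustrate the kind of CM API scripting the role involves.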
Responsibilities
• Proven understanding of and hands-on knowledge with Cloudera Hadoop, YARN, Hive, Tez, Impala, Apache Spark, Ozone, Private Cloud Data Services, HBase, NiFi, Ranger security, and Hive metadata.
• Administer, troubleshoot, perform problem isolation, and correct problems discovered in clusters
• Performance tuning of Hadoop clusters, ecosystem components, and jobs, including managing and reviewing Hadoop log files, identifying root causes, and providing solutions.
• Troubleshoot platform problems and connectivity issues. Diagnose and address application and database performance issues using performance monitors and various tuning techniques.
• Interact with Storage and Systems administrators on Linux/Unix/VM operating systems and Hadoop Ecosystems
• Ability to automate manual tasks, create alerts, and build tooling to detect platform problems
• Platform cluster capacity management covering tenant storage and compute, including tenant storage monitoring, alerting, and reporting (a minimal sketch of such a check follows this list).
• Document programming problems and resolutions for future reference.
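As a rough illustration of the tenant storage monitoring and reporting item above, the sketch below queries WebHDFS for a tenant directory's content summary and compares consumed space against its space quota. The NameNode address, tenant path, and alert threshold are assumptions for illustration only, and authentication (e.g., Kerberos/SPNEGO) is omitted for brevity.

    import requests

    # Assumptions: WebHDFS endpoint, tenant path, and alert threshold are placeholders.
    WEBHDFS = "http://namenode.example.com:9870/webhdfs/v1"

    def tenant_usage(path: str) -> tuple[int, int]:
        """Return (spaceConsumed, spaceQuota) in bytes for a tenant directory."""
        resp = requests.get(f"{WEBHDFS}{path}",
                            params={"op": "GETCONTENTSUMMARY"}, timeout=30)
        resp.raise_for_status()
        summary = resp.json()["ContentSummary"]
        return summary["spaceConsumed"], summary["spaceQuota"]

    if __name__ == "__main__":
        consumed, quota = tenant_usage("/data/tenants/tenant_a")
        # spaceQuota is -1 when no quota is set, so only alert when a quota exists.
        if quota > 0 and consumed / quota > 0.85:  # alert above 85% of quota
            print(f"ALERT: tenant_a at {consumed / quota:.0%} of space quota")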
Requirements
Education: B.E. / B. Tech/M.E. /M. Tech
Experience Range: 10 to 14 years.
Foundational Skills
- Hadoop, Kafka, Spark, Impala, Hive, HBase, etc.
- Strong knowledge of Hadoop Architecture, HDFS, Hadoop Cluster and Hadoop Administrator's role
- Strong technical knowledge: Unix/Linux; Database (Sybase/SQL/Oracle), Java, Python, Perl, Shell scripting, Infrastructure
Desired Skills
- Experience working on Big Data Technologies
- Cloudera Admin / Dev Certification
- Certification in Cloud, Docker-Container, Openshift Technologies
- Experience in Monitoring & Alerting, and Job Scheduling Systems
- Knowledge of automation / DevOps tools - Ansible, Jenkins, SVN, Bitbucket
Work Timings: 11:00 AM to 8:00 PM
Job Location: Chennai