
Data Engineer
IBM
Bangalore, India

Introduction
At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Your Role and Responsibilities
As a Data Engineer in IBM's CIO organization, you will support a data warehouse that consolidates IBM's global real estate data into the Cognitive Enterprise Data Platform (CEDP), including integrations with enterprise systems (TRIRIGA, Maximo, Envizi, and others). The data covers IBM's global internal real estate portfolio: properties, space, leases, energy consumption, construction/renovation projects, and environmental compliance.

Responsibilities:
  • Manage integrations for data ingestion from multiple source systems via APIs, queries, and Apache Spark and Airflow workflows, launching Spark jobs from Airflow as needed (see the first sketch after this list).
  • Support data transformations and aggregations in a Cloud Object Storage (COS) integration zone, and the subsequent data feeds into a DB2 warehouse (see the second sketch after this list).
  • Maintain the Cirrus platform hosting Cloud Object Storage, along with inbound & outbound data flows. Initiate activities such as opening firewall flows, defining entitlements, and managing roles & user access as needed. Identify and perform other activities required to ensure reliable operation of the cluster.
  • Identify and promptly address issues with data and integrations. Implement and optimize monitoring, and troubleshoot errors through in-depth reviews of logs, code, data, and integration components.
  • Document and share data architectures and flows (including schemas, tables, queries, and scheduled activities) with data analysts and Cognos developers.
  • Develop new capabilities for data ingestion and transformation as needed, creating algorithms and building new transfers using Spark/Airflow, APIs, SQL, and related tools. Perform comprehensive testing of individual components as well as the end-to-end solution.
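
The first sketch below illustrates the kind of Spark-in-Airflow orchestration described in the list above. It is a minimal example, not actual CEDP configuration: the DAG name, schedule, script path, and connection ID are assumed placeholders, and it assumes Airflow 2.x with the apache-airflow-providers-apache-spark package installed.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    # Hypothetical daily DAG that submits a PySpark ingestion job.
    with DAG(
        dag_id="realestate_ingest",                # placeholder DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest_leases = SparkSubmitOperator(
            task_id="spark_ingest_leases",
            application="/opt/jobs/ingest_leases.py",  # placeholder PySpark script
            conn_id="spark_default",                   # Spark connection defined in Airflow
            conf={"spark.executor.memory": "4g"},
        )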
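
The second sketch shows one way a COS-to-DB2 feed could look, assuming COS is reached through an S3-compatible (s3a://) endpoint and the IBM DB2 JDBC driver is on the Spark classpath; the bucket, columns, schema, table, and credentials are placeholders rather than the warehouse's real layout.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("cos_to_db2_energy_rollup").getOrCreate()

    # Read raw energy-consumption records from the COS integration zone (placeholder bucket).
    raw = spark.read.parquet("s3a://integration-zone/energy/")

    # Aggregate consumption per property per month before loading the warehouse.
    monthly = (
        raw.groupBy(
            "property_id",
            F.date_trunc("month", F.col("reading_ts")).alias("month"),
        )
        .agg(F.sum("kwh").alias("total_kwh"))
    )

    # Append the rollup to a DB2 warehouse table over JDBC (placeholder connection details).
    (monthly.write
        .format("jdbc")
        .option("url", "jdbc:db2://db2-host:50000/BLUDB")
        .option("dbtable", "RE_WAREHOUSE.ENERGY_MONTHLY")
        .option("driver", "com.ibm.db2.jcc.DB2Driver")
        .option("user", "db2_user")
        .option("password", "changeme")
        .mode("append")
        .save())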

Required Technical and Professional Expertise

  • 5+ years of experience with managing data warehouses and integration solutions
  • API integration experience
  • Apache Spark and Airflow experience
  • SQL development, relational database table structure, and database design
  • Expertise in working with structured and unstructured data
  • Excellent communication skills (written and verbal). Able to clearly communicate with leadership and colleagues about technical capabilities, limitations, issues, and recommendations.
  • Highly organized, detail-oriented, independent, and resourceful.
  • Able to manage complex technical projects with diverse global stakeholders and detailed, interdependent requirements
  • Experience with Agile practices and associated tools, including Jira

Preferred Technical and Professional Expertise

  • Cognos Analytics experience, including creating/maintaining data modules and developing reports & dashboards
  • JavaScript development experience
  • BS or BA in a related discipline

Client-provided location(s): Bengaluru, Karnataka, India
Job ID: IBM-20974583
Employment Type: Full Time
