
Developer

Wipro

Bangalore, India

GCP: Data Engineer

  • Hands-on, deep experience working with Google data products (e.g., BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep).
  • Hands-on experience with SQL and Unix shell scripting.
  • Experience with Python and Kafka.
  • Experience with ELT tools, including hands-on DBT experience.
  • Google Cloud Professional Data Engineers are responsible for developing Extract, Transform, and Load (ETL) processes that move data from various sources into Google Cloud Platform (a minimal load sketch follows this list).
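
As a rough illustration of the kind of ETL load step this role involves, here is a minimal sketch using the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are hypothetical placeholders, not part of the job description.

```python
# Minimal batch-load sketch: CSV files in GCS -> BigQuery table.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

table_id = "my-project.analytics.orders"            # hypothetical target table
uri = "gs://my-ingest-bucket/orders/2024-01-*.csv"  # hypothetical source files

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # infer the schema from the data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load job finishes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```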

Detailed JD:

Must Have

Around 8 to 11 years of experience, with strong knowledge of migrating on-premise ETLs to Google Cloud Platform (GCP).

2-3 years of strong BigQuery and GCP experience.

Very strong SQL writing skills.

Hands-on experience in Python programming.

Hands-on experience in the design, development, and implementation of data warehousing ETL processes (a brief example of running SQL from Python follows).
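
To make the SQL-plus-Python combination concrete, here is a minimal sketch of running a parameterized BigQuery query from Python. The query, table, and column names are hypothetical placeholders.

```python
# Parameterized transformation query run from Python.
# Project, dataset, table, and column names are hypothetical.
import datetime

from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT customer_id,
           SUM(order_total) AS revenue
    FROM `my-project.analytics.orders`
    WHERE order_date = @ds
    GROUP BY customer_id
    ORDER BY revenue DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("ds", "DATE", datetime.date(2024, 1, 15)),
    ]
)

# Iterate the result rows once the query job completes.
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.revenue)
```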

Experience in IT data analytics projects, with hands-on experience migrating on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Google Cloud Storage, Composer, Dataflow, and Cloud Functions. GCP Certified Associate Cloud Engineer. (A minimal Composer-style pipeline sketch follows.)
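
As a sketch of such a migrated pipeline, here is a minimal Cloud Composer (Airflow) DAG that stages a daily extract from Cloud Storage into BigQuery. The DAG, bucket, and table names are hypothetical, and a real migration would add validation and alerting.

```python
# Minimal Cloud Composer (Airflow) DAG: daily GCS extract -> BigQuery staging.
# All bucket, dataset, and DAG names here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="onprem_extract_to_bq",   # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="my-landing-bucket",                # hypothetical landing bucket
        source_objects=["orders/{{ ds }}/*.csv"],  # daily partition path
        destination_project_dataset_table="my-project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )
```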

Practical understanding of data modelling (dimensional and relational), performance tuning, and debugging.

Extensive experience in data warehousing, spanning data extraction, transformation, and loading, and business intelligence technologies built on ELT designs.

Working experience in CI/CD using GitLab and Jenkins.

Good to Have

DBT tool experience

Practical experience in big data application development involving data processing techniques for data ingestion, data modelling, in-stream data processing, and batch analytics, using various Hadoop distributions and ecosystem tools such as HDFS, Hive, Pig, Sqoop, and Spark (a short PySpark sketch follows).
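
For the Spark portion, here is a minimal PySpark batch-analytics sketch: read raw files from HDFS, aggregate, and write the result back. The HDFS paths and column names are hypothetical placeholders.

```python
# Minimal PySpark batch-analytics sketch over HDFS data.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-analytics-sketch").getOrCreate()

# Ingest one day of raw CSV events from HDFS (hypothetical path).
events = (
    spark.read.option("header", True)
    .csv("hdfs:///data/raw/events/2024-01-15/")
)

# Simple batch aggregate: event counts per event type.
daily_counts = events.groupBy("event_type").agg(F.count("*").alias("n"))

# Write the result back as Parquet for downstream consumers.
daily_counts.write.mode("overwrite").parquet(
    "hdfs:///data/curated/event_counts/"
)

spark.stop()
```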

Document all implemented work in Confluence and track all requests and changes in Jira.

Involved in both technical and managerial activities, with experience in GCP.

Responsibilities

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP data warehousing technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through data centers and GCP regions.
  • Work with data and analytics experts to strive for greater functionality in the data systems.

Qualifications

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
  • Experience building and optimizing data warehousing pipelines (ELT and ETL), architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores (a minimal streaming sketch follows this section).
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.

Technical Skillset

  • Experience with data warehouse tools: GCP BigQuery, GCP BigData, Dataflow, Teradata, etc.
  • Experience with relational SQL and NoSQL databases, including PostgreSQL and MongoDB.
  • Experience with data pipeline and workflow management tools: Data Build Tool (DBT), Airflow, Google Cloud Composer, Google Cloud Pub/Sub, etc.
  • Experience with GCP cloud services: primarily BigQuery, Kubernetes, Cloud Functions, Cloud Composer, Pub/Sub, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, Terraform, etc.
  • Experience with CI/CD pipeline and workflow management tools: GitHub Enterprise, Cloud Build, Codefresh, etc.
  • Experience with data analytics and visualization tools: Tableau BI Tool (on-prem and SaaS), Data Analytics Workbench (DAW), Visual Data Studio, etc.
  • GCP Data Engineer certification is mandatory.

Cloud-PaaS-GCP-Google Cloud Platform
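
As a rough illustration of the message-queuing and stream-processing side, here is a minimal Apache Beam sketch (runnable on Dataflow) that reads events from Pub/Sub and appends them to BigQuery. The subscription, table, and schema below are hypothetical placeholders.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> BigQuery.
# Subscription, table, and schema are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # unbounded source

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub"
        )
        | "ParseJson" >> beam.Map(json.loads)  # Pub/Sub delivers bytes
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_type:STRING,user_id:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```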

Client-provided location(s): Bengaluru, Karnataka, India
Job ID: Wipro-3107158
Employment Type: Full Time