Skills:
GCP, Hive, SQL, PySpark, Python
This job requires strong expertise in building ETL pipelines using PySpark, strong Python programming skills, expertise in optimizing PySpark code to transform large volumes of data with minimal resources, and experience working in big data environments.
Experience building, deploying, and supporting data ingestion and batch applications on Google Cloud using capabilities such as BigQuery, Cloud Storage, Dataproc, and Cloud Composer/Airflow.
Along with the above, the following is expected from the candidate:
1. Expected to come to the Visa office for any activities that require in-office presence.
2. Good to have: SQL, PySpark, Hive.
3. Amex Lumi experience is a significant plus.
4. Expected to perform production support activities when required.