Job Overview

Functional Area: Data
Work Preferred: Work from Office
Experience: 5 to 7 years

Job Description


Must-Have


  • 5+ years of experience in data engineering, building and maintaining large-scale data pipelines.
  • Experience designing and implementing a large-scale data lake on cloud infrastructure.
  • Strong technical expertise in Python and SQL.
  • Extremely well-versed in Google Cloud Platform, including BigQuery, Cloud Storage, Cloud Composer, Dataproc, Dataflow, and Pub/Sub.
  • Experience with big data tools such as Hadoop and Apache Spark (PySpark).
  • Experience developing DAGs in Apache Airflow 1.10.x or 2.x (see the sketch after this list).
  • Good problem-solving skills.
  • Detail-oriented.
  • Strong analytical skills for working with a large number of databases and tables.
  • Ability to work with geographically distributed teams.
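
For a flavor of the Airflow work involved, here is a minimal sketch of an Airflow 2.x DAG that loads a daily file from Cloud Storage into BigQuery. The DAG id, bucket, dataset, and table names are illustrative assumptions, not part of any actual codebase for this role:

```python
# Minimal Airflow 2.x DAG: one daily task loading a GCS file into BigQuery.
# All names (DAG id, bucket, dataset, table) are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_gcs_to_bq",          # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-bucket",                   # assumed source bucket
        source_objects=["events/{{ ds }}.csv"],    # templated with the run date
        destination_project_dataset_table="example_project.analytics.events",
        source_format="CSV",
        skip_leading_rows=1,                       # skip the CSV header row
        write_disposition="WRITE_APPEND",
    )
```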


Good to Have


  • Certification in GCP services.
  • Experience with Kubernetes.
  • Experience with Docker.
  • Experience with CircleCI for deployments.
  • Experience with Great Expectations.


Responsibilities


  • Build data and ETL pipelines in GCP.
  • Support migration of data to the cloud using big data technologies such as Spark, Hive, Talend, and Java (a PySpark sketch follows this list).
  • Interact with customers daily to ensure smooth engagement.
  • Take responsibility for timely, high-quality deliveries.
  • Fulfill organizational responsibilities: share knowledge and experience with other groups in the organization and conduct technical training sessions.
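
As a flavor of the migration work, here is a minimal PySpark sketch of a typical cloud-migration transform: read raw CSVs, deduplicate, and write partitioned Parquet. The paths and column names are illustrative assumptions:

```python
# Minimal PySpark migration-style transform: read raw CSVs from Cloud Storage,
# clean them, and write partitioned Parquet. Paths and columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("migrate_events").getOrCreate()

# Read the raw landing-zone files (assumed path and layout).
raw = spark.read.csv(
    "gs://example-bucket/raw/events/", header=True, inferSchema=True
)

# Deduplicate on an assumed key column and derive a partition date
# from an assumed timestamp column.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write the curated layer, partitioned by date (assumed destination path).
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/curated/events/"
)
```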


Location: Pune, Hyderabad, or Remote


Education: Bachelor's or Master's degree (preferably BE/B.Tech) in Computer Science/IT.



Skills

Cloud Infrastructure, Designing, ETL, Python, Quality, SQL, Training