
Job Overview

Functional Area


Work preferred

Work from Office


Min Experience

5 Years

Max Experience

7 Years


Job Description

Primary Skills: Snowflake, DWH, Databricks, Spark, Python, SQL

Roles & Responsibilities:

• 5+ years of data engineering or general software development experience

• Experience working with large datasets (terabyte scale and growing) and familiarity with various technologies and tooling associated with databases and big data

• Relational DB (PostgreSQL/MySQL)

• Big Data - Snowflake, PySpark

• Knowledge of the AWS data ecosystem, including AWS S3 and AWS Lambda, is a plus

• Demonstrated proficiency in data design and data modeling

• Experience developing complex ETL processes from concept to implementation, including defining SLAs, performance measurement, and monitoring

• Proficiency in query languages and data exploration, with a proven record of writing complex SQL queries across large datasets

• Systems performance and tuning experience, with an eye for how systems architecture and design impact performance and scalability

• Strong software engineering principles

• Experience in functional programming in Python or in an equivalent language

• Self-motivated, organized, and detail-oriented, with a strong sense of ownership

• Strong communication skills, with the ability to work effectively with both business and technical teams

• Ability to work in a fast-paced and dynamic environment

• Ability to break down complex problems into simple solutions.

• Strong interpersonal skills, intense curiosity, and an enthusiasm for solving difficult problems
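As a rough illustration of the functional programming in Python mentioned above, here is a minimal sketch (the record layout and field names are hypothetical, not from this posting) that transforms a dataset with pure functions — filter, map, and reduce — without mutating shared state:

```python
from functools import reduce

# Hypothetical event records; in practice these might come from a
# Snowflake query or a Spark DataFrame collected to the driver.
events = [
    {"user": "a", "bytes": 120},
    {"user": "b", "bytes": 0},
    {"user": "a", "bytes": 300},
]

def total_bytes(records):
    """Sum the 'bytes' field of non-empty records using pure functions."""
    non_empty = filter(lambda r: r["bytes"] > 0, records)   # drop empty transfers
    sizes = map(lambda r: r["bytes"], non_empty)            # project the size field
    return reduce(lambda acc, n: acc + n, sizes, 0)         # fold into a total

print(total_bytes(events))  # 420
```

The same filter/map/reduce style carries over directly to PySpark, where the transformations run lazily and distributed rather than eagerly in one process.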


Must have Skills:

• Big Data

• Snowflake

• Airflow/ADF/SSIS


• Databricks

• Spark

• Python


Good to Have:

• AWS Data Ecosystem, including AWS S3 and AWS Lambda.


Tags: ETL, Python, Snowflake, Software Development, SQL