
Job Overview

Work Preferred: Work from Office

Min Experience: 3 Years

Max Experience: 5 Years


Data Engineer (Python & Neo4j)

Location: Gurgaon

Experience: 3+ years

Employment Type: Full Time

Key Skills Required:

  • Python: Proficiency in the Python programming language, including its core concepts, libraries, and frameworks.
  • SQL: Strong understanding of SQL and experience in working with relational databases.
  • AWS: Familiarity with Amazon Web Services (AWS) and ability to utilize various AWS services effectively.
  • Airflow: Experience with Apache Airflow, an open-source platform for programmatically authoring, scheduling, and monitoring workflows.
  • Neo4j: Expertise in working with Neo4j, a graph database, including data modeling, query optimization, and graph traversal techniques.
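As a rough illustration of the Python and SQL side of this stack, the sketch below loads rows into an in-memory SQLite table and aggregates them, the way a small pipeline step might. The `events` schema and its data are invented for the example; a real pipeline would target a production relational database rather than SQLite.

```python
import sqlite3

# Invented example schema: a tiny "events" table for an ETL-style demo.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, user TEXT, amount REAL)"
)
rows = [(1, "alice", 10.5), (2, "bob", 3.0), (3, "alice", 7.25)]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

# Aggregate with SQL, as a transformation step in a pipeline might.
total_by_user = dict(
    conn.execute("SELECT user, SUM(amount) FROM events GROUP BY user")
)
print(total_by_user)  # e.g. {'alice': 17.75, 'bob': 3.0}
```

The parameterized `?` placeholders mirror the kind of injection-safe query construction expected when working with relational databases in Python.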


Key Responsibilities:

  • Develop and maintain data pipelines and ETL processes using Python, ensuring data integrity, quality, and performance.
  • Design, optimize, and manage databases, including data modeling, schema design, and query optimization for relational and graph databases.
  • Utilize SQL to query and manipulate data in relational databases effectively.
  • Leverage AWS services, such as S3, Redshift, and Glue, to build scalable and efficient data processing and storage solutions.
  • Implement and manage workflows using Apache Airflow to automate data pipelines and task scheduling.
  • Utilize Neo4j's graph database capabilities to design and optimize graph data models, perform complex graph queries, and build data-driven applications.
  • Collaborate with data scientists and analysts to understand data requirements, provide data engineering support, and optimize data workflows.
  • Ensure data pipelines and ETL processes adhere to best practices in data security, governance, and compliance.
  • Monitor and troubleshoot data pipelines, identify and resolve issues, and propose performance optimization strategies.
  • Stay updated with the latest industry trends, technologies, and best practices in data engineering and graph databases.
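Much of the Neo4j work above revolves around parameterized Cypher queries. The sketch below only constructs a query string and its parameter map; the `Person` label, `FOLLOWS` relationship, and names are hypothetical, and executing the query against a live database would go through the official `neo4j` Python driver's session.

```python
# Hypothetical label/relationship for illustration; a real application would
# pass the returned query and parameters to the `neo4j` driver for execution.
def merge_follows_query(name: str, follows: str) -> tuple[str, dict]:
    """Build a parameterized Cypher MERGE for (:Person)-[:FOLLOWS]->(:Person)."""
    query = (
        "MERGE (a:Person {name: $name}) "
        "MERGE (b:Person {name: $follows}) "
        "MERGE (a)-[:FOLLOWS]->(b)"
    )
    return query, {"name": name, "follows": follows}

query, params = merge_follows_query("alice", "bob")
print(params)  # {'name': 'alice', 'follows': 'bob'}
```

Using `$name`-style parameters rather than string interpolation lets Neo4j cache query plans and avoids Cypher injection, the same discipline as parameterized SQL.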


Qualifications:

  • Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).
  • At least 3 years of proven work experience as a Data Engineer, preferably with expertise in Python and Neo4j.
  • Strong understanding of data modeling, ETL processes, and data integration techniques.
  • Proficiency in SQL and experience in working with relational databases.
  • Familiarity with cloud platforms, particularly AWS, and experience with relevant services.
  • Knowledge of Apache Airflow or similar workflow management tools.
  • Excellent problem-solving and analytical skills, with the ability to handle large volumes of data and implement efficient data processing solutions.
  • Strong communication skills, both verbal and written, to effectively collaborate with team members and stakeholders.

Note: This job description provides a general overview and may be subject to modifications based on specific project requirements and organizational needs.


Tags: Data Integration, Data Modeling, Data Processing, ETL, Python, Quality, SQL