Job Description
We are seeking a talented and motivated Data Engineer to join our founding team in India. This role is critical in shaping the data foundation of the Axyo AI platform and our enterprise applications. As an early member of our India team, you will work closely with the VP of Data Science, ML Engineers, and DevOps Engineers to design and implement robust data pipelines, scalable infrastructure, and high-quality data systems that power our AI products.
Responsibilities
Data Infrastructure & Pipelines
- Design, develop, and maintain scalable, efficient, and reliable automated ETL pipelines.
- Integrate diverse structured and unstructured data sources into unified data models.
- Implement data quality checks, monitoring, and validation frameworks to ensure trustworthy datasets.
- Build streaming and batch data pipelines to serve both real-time and analytical use cases.
- Contribute to customer implementations by building data ingestion, transformation, and integration workflows.
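The extract-validate-transform-load flow described above can be sketched minimally in Python. This is an illustrative sketch only: the field names (`user_id`, `signup_ts`) and the unified model are assumptions, not part of this role's actual data model.

```python
# Minimal ETL sketch with a data-quality gate between extract and load.
# Field names (user_id, signup_ts) are hypothetical, for illustration only.
from datetime import datetime

def extract(rows):
    """Extract step: in practice this would read from S3, an API, or a database."""
    return list(rows)

def validate(row):
    """Data-quality check: required fields present and timestamp parseable."""
    try:
        return bool(row["user_id"]) and bool(datetime.fromisoformat(row["signup_ts"]))
    except (KeyError, ValueError):
        return False

def transform(row):
    """Normalize a raw record into the unified model."""
    return {
        "user_id": str(row["user_id"]).strip(),
        "signup_date": datetime.fromisoformat(row["signup_ts"]).date().isoformat(),
    }

def run_pipeline(source_rows):
    """Run extract -> validate -> transform, dropping rows that fail validation."""
    return [transform(r) for r in extract(source_rows) if validate(r)]
```

In production the same shape would typically run as tasks in an orchestrator such as Airflow, with the rejected rows routed to a quarantine table for monitoring rather than silently dropped.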
Technical Excellence & Best Practices
- Implement and advocate for engineering best practices in data architecture, testing, versioning, and CI/CD.
- Develop and manage APIs for data ingestion, transformation, and integration workflows.
- Ensure adherence to data governance, privacy, and security standards across all pipelines.
- Monitor and optimize performance, scalability, and cost-efficiency of data infrastructure.
- Participate in code reviews and architecture discussions to raise the bar on technical quality.
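The testing practice mentioned above can be as simple as unit tests around each transformation, run from CI. A hedged sketch, where the `normalize_email` helper is a hypothetical example (not part of this role's codebase); in a real repository these assertions would live in a pytest suite:

```python
# Unit-testing a transformation, the testing best practice referenced above.
# normalize_email is a hypothetical helper used only for illustration.

def normalize_email(raw: str) -> str:
    """Normalize an email address before loading it downstream."""
    return raw.strip().lower()

def test_normalize_email():
    # Cover the happy path plus whitespace and casing edge cases.
    assert normalize_email("  Alice@Example.COM ") == "alice@example.com"
    assert normalize_email("bob@example.com") == "bob@example.com"

test_normalize_email()  # pytest would normally collect and run this
```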
Requirements
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
- 2–5 years of hands-on experience in data engineering, preferably in fast-paced product or platform teams.
- Strong programming skills in Python, plus working knowledge of SQL, SQLAlchemy, and PySpark.
- Experience with AWS cloud services (S3, Sagemaker, SQS, Lambda, Glue, EMR, RDS).
- Experience with data pipeline frameworks such as Airflow, AWS Glue, Step Functions, or Lambda-based pipelines (experience with Prefect/Dagster a plus).
- API development and management with Python (e.g., FastAPI, Flask, Django REST Framework).
- Proficiency with relational and NoSQL databases (e.g., PostgreSQL, MySQL, DynamoDB).
- Hands-on experience with big data and streaming technologies (e.g., Spark, Kafka).
- Excellent problem-solving skills.
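The API development requirement above would typically mean a FastAPI POST route backed by a Pydantic model. To stay self-contained, the sketch below uses only the standard library; the `events` payload shape and the `ingest` handler are assumptions for illustration, not this company's actual API.

```python
# Hedged sketch of an ingestion API handler. In a real FastAPI service this
# logic would sit behind a POST route with a Pydantic request model; the
# payload shape ({"events": [{"name": ..., "ts": ...}]}) is assumed.
from dataclasses import dataclass

@dataclass
class IngestResult:
    accepted: int
    rejected: int

def ingest(payload: dict) -> IngestResult:
    """Validate incoming event records and count accepts vs. rejects."""
    accepted = rejected = 0
    for event in payload.get("events", []):
        # Minimal schema check: dict with a non-empty name and a timestamp key.
        if isinstance(event, dict) and event.get("name") and "ts" in event:
            accepted += 1
        else:
            rejected += 1
    return IngestResult(accepted=accepted, rejected=rejected)
```

Returning an accept/reject summary (instead of failing the whole batch) is one common design choice for ingestion endpoints, since it lets callers retry only the bad records.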
Skills
API Design and Development, AWS Storage, ETL, Data Ingestion, PySpark, AWS Glue, PostgreSQL, FastAPI
Important Dates & Deadlines
Application Deadline
24 Nov 25, 04:05 PM IST

