Job Description
Experience: 4-6 years

Key Responsibilities
1. Design and develop engineering solutions to aggregate and automate large-scale data flows from varying sources
2. Build ETLs from internal and external sources to provide insights into the business
3. Write and tune complex data pipelines, workflows, and data structures
4. Help continually improve ongoing reporting and analysis processes, automating or simplifying operational support for stakeholders
5. Create and maintain workflow/technical documents for all development activities
6. Experience in communicating with business stakeholders as well as with colleagues

Technical Experience
1. Experience with an industry big data platform stack such as Databricks, Spark, Hadoop, Kafka, AWS DMS, AWS Fargate, etc.
2. Must-have skills: Spark, Kafka, Airflow, Python, Glue, functional programming
3. Deliver the right solution architecture, automation, and technology choices, starting from the experimentation and proof-of-concept phases of new analytical models
4. Ability to design logical and physical data models, big data storage architecture, data modelling methodologies, metadata management, and AWS serverless native services
5. Strong experience with data warehouse technologies, ETL, and SQL Server
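To illustrate the core ETL responsibility above (extract from a source, transform, load into a store), here is a minimal stdlib-only Python sketch. The feed, table name, and filter threshold are hypothetical placeholders, not part of this role's actual stack:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed standing in for an external data source
RAW_CSV = "order_id,amount\n1,10.50\n2,3.25\n3,7.00\n"

def extract(raw: str):
    """Extract: parse rows out of a CSV source feed."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and keep only orders above a threshold."""
    return [(int(r["order_id"]), float(r["amount"]))
            for r in rows if float(r["amount"]) > 5.0]

def load(records, conn):
    """Load: write the cleaned records into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In a production pipeline each stage would typically be a separate task on a platform such as Spark or Glue; the three-function split shown here is only the shape of the pattern.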
Skills
Automation, ETL, Python, Reporting, SQL, SQL Server
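Several of the skills listed involve workflow orchestration (Airflow-style task dependencies). The underlying idea, running tasks in dependency order over a DAG, can be sketched with the standard library alone; the task names and dependency graph below are hypothetical and this is not the Airflow API:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to its upstream dependencies
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

# A valid execution order respecting every dependency edge
order = list(TopologicalSorter(pipeline).static_order())
```

An orchestrator like Airflow adds scheduling, retries, and monitoring on top of exactly this dependency resolution.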
If an employer asks you to pay any kind of fee, please notify us immediately. Jobaaj does not charge any fee from applicants, nor do we allow other companies to do so.