Job Description
Responsibilities
- Design and Build Data Warehouses: Architect scalable and efficient data warehouses that support analytics and reporting needs.
- Develop and Optimize ETL Pipelines: Write complex SQL queries and use tools like Apache Airflow to automate data pipelines.
- Query Optimization and Performance Tuning: Write efficient SQL queries for ETL jobs as well as for dashboards in BI tools.
- Database Management: Work with MySQL, PostgreSQL, and Spark to manage structured and semi-structured data.
- Data Quality and Governance: Ensure data accuracy, consistency, and completeness through validation, monitoring, and governance practices.
- Implement Data Governance Best Practices: Define and enforce data standards, access controls, and policies to maintain a well-governed data ecosystem.
- Data Modeling and ETL Best Practices: Ensure robust data modeling and apply best practices for ETL development.
- BI and Dashboarding: Work with BI tools such as Power BI, Tableau, and Apache Superset to create insightful dashboards and reports.
- Propose and Implement Solutions: Identify and propose improvements to existing systems and take ownership of designing and developing new data solutions.
- Collaboration and Problem Solving: Work independently, collaborate with cross-functional teams, and proactively troubleshoot data challenges.
Requirements
- Strong understanding of data warehouse and data lake concepts.
- 3-7 years of experience in the data domain (data engineering + BI).
- Strong SQL skills with expertise in writing efficient and complex queries.
- Hands-on experience with data warehouse concepts and ETL best practices.
- Proficiency in MySQL, PostgreSQL, and Spark.
- Experience with Python and building pipelines in Airflow or similar tools.
- Strong understanding of data modeling techniques for analytical workloads.
- Experience with Power BI, Tableau, or Apache Superset for reporting and dashboarding.
- Experience with data quality frameworks, data validation techniques, and governance policies.
- Ability to work independently, identify problems, and propose effective solutions.
- Experience building real-time pipelines is preferred.
- Experience handling multi-tenant data is a plus.
- Bonus: Experience with dbt for data transformations.
- Should be enthusiastic to learn new concepts and technologies and be able to implement them with minimal supervision.
- Strong adherence to best practices in coding and database/visualization development.
- Excellent problem-solving skills, attention to detail, and communication skills.
Skills
Data Validation, Python, Dashboarding, Data Governance, Data Modeling, Data Warehousing, ETL, MySQL, Visualization, Data Engineering, Analytics, SQL
Important Dates & Deadlines
Application Deadline
03 Apr 26, 04:47 PM IST