Job Description
- Design, develop, and implement ETL (Extract, Transform, Load) workflows using Informatica PowerCenter & Informatica Developer to support data integration and data warehousing initiatives.
- Optimize and tune Informatica mappings, sessions, and workflows for performance and scalability.
- Collaborate with data architects, analysts, and business stakeholders to gather requirements and translate them into technical specifications.
- Ensure data quality, integrity, and security across all data integration processes.
- Troubleshoot and resolve issues in ETL processes, providing root cause analysis and implementing solutions.
- Lead the migration of legacy ETL processes to modern Informatica solutions or cloud-based platforms (e.g., Informatica Cloud).
- Mentor junior engineers and provide technical guidance on best practices for Informatica development.
- Maintain documentation for ETL processes, data flows, and system architecture.
- Stay updated on Informatica product advancements and industry trends to recommend innovative solutions.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, or equivalent experience.
- 5+ years of experience in ETL development with Informatica PowerCenter or Informatica Cloud.
- Proficiency in designing and developing complex mappings, transformations, and workflows.
- Strong knowledge of SQL, PL/SQL, and relational database systems (e.g., Oracle, SQL Server, MySQL).
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift).
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and Informatica Cloud (IICS) is a plus.
- Expertise in performance tuning and optimization of ETL processes.
- Strong understanding of data modeling, data integration, and data governance principles.
- Excellent problem-solving, communication, and leadership skills.
- Informatica certifications (e.g., PowerCenter Developer, Data Integration) are highly desirable.

Preferred Skills
- Experience with other ETL tools (e.g., Talend, SSIS) or big data technologies (e.g., Hadoop, Spark).
- Knowledge of scripting languages (e.g., Python, Shell) for automation.
- Experience with Power BI.
- Experience with Agile development methodologies.
Skills
Big Data, Python, Data Governance, Data Integration, Data Modeling, Data Warehousing, Data Warehousing Concepts, ETL, MySQL, Snowflake, Root Cause Analysis, Analytics, Google Cloud, SQL
Important Dates & Deadlines
Application Deadline: 28 Mar 26, 05:25 PM IST