Work from Office
EPAM’s global, multi-disciplinary teams of 57,450+ employees serve customers in more than 50 countries across six continents.
As a recognized leader, EPAM is listed among the top 15 companies in Information Technology Services on the Fortune 1000 and ranked as the top IT services company on Fortune’s 100 Fastest-Growing Companies list for the last three consecutive years.
EPAM is also listed among Ad Age’s top 25 World’s Largest Agency Companies and in 2020, Consulting Magazine named EPAM Continuum a top 20 Fastest-Growing organization.
- Design, develop, and maintain the data architecture, data models, and standards for various Data Integration & Data Warehousing projects on GCP, combined with other technologies
- Ensure the use of BigQuery SQL, Java/Python/Scala, and Spark reduces lead time to delivery and aligns with the overall group strategic direction, so that cross-functional development remains usable
- Own technical solutions from a design and architecture perspective, ensure the right direction, and propose resolutions to potential data pipeline problems
- Expand and optimize our data and data pipeline architecture, and optimize data flow and collection for cross-functional teams
- Provide technical guidance and support to a vibrant engineering team, coaching and teaching your teammates how to do great data engineering
- A deep understanding of data architecture principles and data warehouse methodologies, specifically Kimball or Data Vault
- An expert in GCP, with 8-12 years of delivery experience with Dataproc, Dataflow, BigQuery, Compute, Pub/Sub, and Cloud Storage
- Highly knowledgeable in industry best practices for ETL design, principles, and concepts
- Equipped with 3 years of programming experience with Python
- A DevOps and Agile engineering practitioner with experience in test-driven development
- Experienced in the following technologies: Google Cloud Platform, Dataproc, Dataflow, Spark SQL, BigQuery SQL, PySpark, and Python/Scala
- Experienced in the following Big Data technologies: Spark, Hadoop, Kafka, etc.
- Experience in the Insurance/BFSI domain is preferred
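As a toy illustration of the ETL design principles this role calls for, the sketch below shows a minimal extract-transform-load pass in plain Python. All names and records are hypothetical; a real pipeline here would run on Dataproc/Dataflow against BigQuery rather than in-memory structures.

```python
# Minimal ETL sketch (hypothetical data): extract raw records,
# transform them into typed rows, load them into a keyed target.

def extract():
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return [
        {"policy_id": "P-1", "premium": "1200.50"},
        {"policy_id": "P-2", "premium": "830.00"},
    ]

def transform(records):
    """Transform: cast string fields to proper types."""
    return [
        {"policy_id": r["policy_id"], "premium": float(r["premium"])}
        for r in records
    ]

def load(records, target):
    """Load: upsert cleaned records into the target, keyed by policy_id."""
    for r in records:
        target[r["policy_id"]] = r
    return target

warehouse = {}
load(transform(extract()), warehouse)
```

The same three-stage shape carries over to the GCP stack: extract from Pub/Sub or Cloud Storage, transform with Spark/Dataflow, load into BigQuery.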
- Big Data
- Insurance Coverage
- Paid Leaves – including maternity, bereavement, paternity, and special COVID-19 leaves.
- Financial assistance for medical crisis
- Retiral Benefits – VPF and NPS
- Customized Mindfulness and Wellness programs
- EPAM Hobby Clubs
- Hybrid Work Model
- Soft loans to set up workspace at home
- Stable workload
- Relocation opportunities with ‘EPAM without Borders’ program
- Certification trainings for technical and soft skills
- Access to unlimited LinkedIn Learning platform
- Access to internal learning programs set up by world class trainers
- Community networking and idea creation platforms
- Mentorship programs
- Self-driven career progression tool