
Job Overview

Functional Area:
Work preferred: Work from Office
Min Experience: 3 Years
Max Experience: 8 Years


We're searching for a Data Platform Engineering Analyst to join Telstra's Data team in India. We are open to you being based in Hyderabad, Pune, or Bangalore.
As a Data Platform Engineering Analyst, you bridge the hardware-software boundary. You collaborate to create and maintain the data and automation platforms that power our data integrations, so that users can access highly reliable data assets and solutions. As a member of the team and a data custodian, you will play a big part in supporting the delivery of customer value in the data integration and platform extension space.

We love to work collaboratively; you'll have the chance to come into the office from time to time to work with other areas of Data and even other business units, sharing ideas, talking data and tech at Brown Bag sessions and lunch-and-learns, and generally bonding with your teammates.

When you join Telstra, you'll have the chance to develop professionally: we have dedicated People Development days, and we have just won a Cloud Innovator Award from Nasscom. Now really is a great time to join the Data team, as we've grown around 40% in the last year, with similar growth expected in the new financial year, giving you the chance to be part of something really special.

Key Responsibilities

  • Contribute to the design and delivery of data platforms across multiple projects and functional areas to enable the execution of the Data Strategy
  • Work with Enterprise architects and Data Solution architects to ensure that integration between systems and data models is in place
  • Contribute to proposals for better system designs/architecture to build reliable data flows, combining analytical batch processing, real-time data flows and low-latency APIs
  • Use your knowledge within data engineering to contribute to the management and maintenance of data and automation platforms through their life cycles
  • Create and maintain CI/CD pipelines for enabling automated build, test, and deployment of the code
  • Identify and automate recurring platform administration and maintenance tasks to drive efficiency and effectiveness
  • Identify and remediate technical debt for meeting scalability and security requirements
  • Actively contribute to a culture of continuous delivery and agile development by supporting team-based planning activities, writing scrum team stories, and collaborating with other scrum team members to estimate and deliver work inside of a sprint.
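
The automation responsibility above (identifying and scripting recurring platform administration tasks) can be sketched in Python. The helper below, prune_stale_files, is a hypothetical illustration of a typical recurring maintenance job, not part of any Telstra system:

```python
import time
from pathlib import Path

def prune_stale_files(root: Path, max_age_days: int) -> list[Path]:
    """Delete files under `root` older than `max_age_days` days.

    Returns the paths that were removed, so the run can be logged
    and audited - a common requirement for scheduled admin jobs.
    """
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path)
    return removed
```

A task like this would typically be wired into a scheduler (cron, Azure Automation, or a pipeline job) rather than run by hand, which is the "drive efficiency" part of the bullet above.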

What you'll need:

  • 3 to 8 years' experience working as a cloud data platform engineer or in a similar role, preferably in the IT industry
  • Basic understanding of ETL, ELT, and Data warehousing
  • Understanding of security concepts: firewalls, security policies, access control, identity management, patch management, and infrastructure in general
  • Experience building and maintaining cloud data platform components such as ADLS Gen2, Azure Synapse, Azure Cosmos DB, Event Hubs, ADF, and Azure Databricks
  • Linux and Windows Administration
  • Scripting experience - Python, Bash, PowerShell, or similar
  • Expertise in Unix and DevOps automation tools for deploying applications to at least one cloud provider, such as Azure, AWS, or GCP
  • Administration of big data technologies (Hadoop, Hive, Spark, and Impala) is a plus
  • Experience as a DBA for MPP systems such as Teradata/Azure Synapse/Redshift is a plus

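
As context for the ETL/ELT requirement above, here is a minimal extract-transform-load sketch in Python. The function name run_etl and the payments table are illustrative only, not a real pipeline:

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform them (trim and title-case
    names, cast amounts to float), and load them into a `payments`
    table. Returns the number of rows loaded."""
    rows = csv.DictReader(io.StringIO(csv_text))                 # extract
    cleaned = [
        (r["customer"].strip().title(), float(r["amount"]))      # transform
        for r in rows
    ]
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned)  # load
    conn.commit()
    return len(cleaned)
```

In an ELT variant, the raw rows would be loaded first and the transformation pushed down into the warehouse (e.g. Synapse SQL), which is the distinction the bullet above is testing for.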
About you

  • Able to configure Azure data components (ADLS Gen2, Azure Synapse/SQL DW, Data Factory, Databricks, Key Vault, Cosmos DB, Azure Functions, Azure Event Hubs, Log Analytics, Azure Monitor) to deliver a scalable, secure, and performant platform for data use cases
  • Experience administering and maintaining big data/MPP systems such as Hadoop and Azure Synapse
  • Experience administering and maintaining Spark clusters on YARN, Kubernetes, or Azure Databricks
  • Able to identify and automate recurring admin tasks using scripting technologies such as UNIX shell, PowerShell, or Python
  • Experience designing, developing, and maintaining DevOps tooling (nice to have), including:

a. Bitbucket/GitLab/Azure DevOps
b. Bamboo/Azure Pipelines/GitLab CI
c. Defining a branching strategy
d. Enabling blue/green deployments and feature flagging using CI/CD tools
e. Exposure to code scanning tools like SonarQube, SourceClear, Coverity
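
Item d above (feature flagging) can be sketched minimally in Python. The is_enabled helper and the FEATURE_* environment-variable convention are hypothetical illustrations of the idea, not a specific tool:

```python
import os

def is_enabled(flag: str, default: bool = False) -> bool:
    """Read a feature flag from the environment, e.g. FEATURE_NEW_INGEST=on.

    Flags let new code paths ship dark and be switched on per
    environment, which is what makes blue/green rollouts reversible.
    """
    value = os.environ.get(f"FEATURE_{flag.upper()}", "")
    if not value:
        return default
    return value.lower() in {"1", "on", "true", "yes"}

def ingest(records):
    """Toy pipeline step with a flagged new code path."""
    if is_enabled("NEW_INGEST"):
        return [r.upper() for r in records]  # new behaviour, behind the flag
    return list(records)                     # stable behaviour
```

In a real pipeline the flag would come from the CI/CD tool or a config service rather than raw environment variables, but the control-flow pattern is the same.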

Relevant certifications: AZ-104, AZ-400, DP-300, Databricks Platform Administrator

Why join us?
A career in one of our technology teams will give you the exciting responsibility of using technology, automation, and innovation to help solve the world's biggest technological challenges in areas such as the Internet of Things (IoT), 5G, Artificial Intelligence (AI), Machine Learning, and more.

We're committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural, and linguistic backgrounds, and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.

To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit

