Big Data Engineer

  1. Full Time
  2. IT/Technology
Posting date: 18 Dec, 2018


The Big Data Engineer provides expert guidance and delivers through self and others to:


  1. Integrate the necessary data from several sources into the Big Data Programme for analysis and for Technology actions;

  2. Build applications that make use of large volumes of data and generate outputs that enable commercial actions delivering incremental value;

  3. Deliver and implement core capabilities (frameworks, platform, development infrastructure, documentation, guidelines and support) to speed up delivery by Local Markets and tenants in the Big Data Programme, assuring quality, performance and alignment of component releases in the platform to the Group technology blueprint;

  4. Support local markets, tenants and Group functions in obtaining business value from the operational data.

Key accountabilities and decision ownership: 
• Designing and producing high-performing, stable end-to-end applications that perform complex batch and streaming processing of massive volumes of data in a multi-tenancy big data platform, both Hadoop on-premises and in the cloud, and output insights back to business systems according to their requirements.
• Designing and implementing core platform capabilities, tools, processes, ways of working and conventions under agile development to support the integration of Local Market and tenant data sourcing and use-case implementation, promoting reusability, easing delivery and ensuring standardisation across Local Markets deliverables in the platform.
• Supporting the distributed data engineering teams, including technical support and training in the Big Data Programme frameworks and ways of working, review and integration of source code, and support for releases and source code quality control.
• Working with the Group architecture team to define the strategy for evolving the Big Data capability, including solution architectural decisions aligned with the platform architecture
• Defining the technologies to be used on the Big Data Platform and investigating new technologies to identify where they can bring benefits 

Core competencies, knowledge and experience:
• Expert-level experience in designing, building and managing applications to process large amounts of data in a Hadoop ecosystem or other big data frameworks;
• Extensive experience with performance tuning applications on Hadoop or other big data frameworks, and with configuring these systems to maximise performance;
• Experience building systems to perform real-time data processing using Spark Streaming, Flink, Storm or Heron data processing frameworks, and Kafka, Beam, Dataflow, Kinesis or similar data streaming frameworks;
• Experience with the common SDLC, including SCM, build tools, unit, integration, functional and performance testing from an automation perspective, TDD/BDD, CI and continuous delivery, under agile practices;
• Experience working in large-scale multi-tenancy big data environments.

Key performance indicators:
• Development of core frameworks to speed up and facilitate integration of Local Markets developments in the Big Data Programme
• Speed of on-boarding data sources and use cases for EU Hub markets and new tenants
• Delivery of integrated use cases from Local Markets and tenants that add value to the business using the Big Data Programme
