Big Data Platform Engineer - West London - GCP, Hadoop, Terraform
I am working with a telecommunications giant based in West London to recruit an experienced Big Data Platform Engineer with experience of Google Cloud Platform and Hadoop. You will play a senior role in defining and delivering their world-class, next-generation Google Cloud-based Data Products, which leverage massive datasets, visualisations and Machine Learning within digital apps.
Your role will be to work as part of their Cloud team to deliver their Google Cloud infrastructure and software deployment automation across multiple projects. You will be working on large, enterprise-scale systems that deliver local data to a central platform hosted in the cloud. They aim to produce a self-healing solution that reduces the need for operational support by over 70%. We are looking for a Big Data Platform Engineer who combines a DevOps approach with Big Data technologies and strong programming skills.
To be considered your CV will outline your skills and experience in most of the following:
• Strong skills with the Google Cloud Platform - Data Fusion, Dataflow, BigQuery etc.
• Infrastructure-as-code (IaC) - Terraform
• Strong skills in the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN)
• Strong software development experience in Java, Scala and Python, or other programming languages
• Expertise in PaaS, CaaS and IaaS tools and environments, particularly with Google Cloud Platform (GCP).
• Jenkins, GitLab/GitHub, Nexus or equivalent CI/CD tools
• Scripting for automation - Bash, Python, Ruby etc.
• EDW or Data Warehouse solutions
If you have the skills and experience, and want to work for a global player in the communications and technology market, please APPLY NOW.
Salary: £80,000 + good benefits
Location: West London - 3 days remote working after the pandemic