Security Cleared Big Data Engineer
Salary: Competitive DOE + Benefits
Location: Croydon / Kent and WFH
For this role we are looking for a Big Data Engineer who is SC Cleared; we would also consider candidates who are eligible for SC Clearance. Ideally we are looking to fill this as a permanent role, but we may consider experienced Big Data Engineers who are currently SC Cleared and available immediately on a contract basis.
Our client is looking to recruit a driven and highly logical Big Data Engineer / DevOps Engineer with a proven track record of developing and implementing data-related technologies on time, to specification and to quality-assured standards. The ideal candidate will have roots in Java services, Hadoop ecosystems and Amazon Web Services, and will hold, or be eligible for, Security Clearance at a minimum of SC level.
As a Big Data Engineer, you will have previous experience in the configuration of the Hortonworks Data Platform (HDP), with experience of services such as Ambari, HDFS, HBase, MapReduce, YARN, Sqoop, Oozie, ZooKeeper, Spark and Kafka. As an experienced Big Data Engineer you will also have experience in using automation tools such as Ansible to automate Hadoop cluster configuration, deployment and patching.
The role will involve the setup of CI/CD for Java- and Python-based services and applications, using tools such as Jenkins or AWS Drone, so experience with these technologies would also be beneficial.
The role will be part of a wider team focused on providing Big Data solutions to their customers. The successful candidate will be required to administer and optimise multiple Kerberized Hadoop clusters across a number of AWS environments, which will require cross-team collaboration: supporting your colleagues to optimise their CI/CD pipelines and take code releases from dev to production.
To apply you must:
Have two years of practical experience with the Hortonworks Data Platform.
Be a Hortonworks Certified Administrator, or be working towards this or an equivalent certification.
Be experienced in Java services and Hadoop ecosystem components such as Ambari, HDFS, HBase, MapReduce, YARN, Sqoop, Oozie, ZooKeeper, Spark and Kafka.
Have good working experience with Amazon Web Services components such as EC2, S3 and Redshift.
Hold current Security Clearance at SC level, or be eligible for UK Security Clearance at SC level.
Be educated to graduate level, or hold a professional equivalent, demonstrating a sound grasp of computer science.
Be a good communicator, both written and verbal, and a good listener, with the ability to grasp complex business situations and represent them technically.
Be experienced in structured problem solving, ideally trained and/or certified in a suitable methodology.
Be willing to travel throughout Europe as projects require.