Job Details

Hadoop ETL Developer - Senior Big Data ETL Developer

Northamptonshire, United Kingdom
You will design, develop and maintain the ETL processes used to support an enterprise data environment that consists of Big Data technologies.
You will help build data connectors and interfaces that allow our stakeholders to consume data from data lakes/warehouses.
You will design scalable and secure data solutions that will be consumed by business applications such as statistical models or MI reports.
You will work collaboratively with a cross-functional team of ETL developers, Business Intelligence designers, architects, business analysts and infrastructure engineers.

Key Accountabilities
You will provide technically sound solutions for the ingestion, storage and presentation of enterprise data, including ETL design, data storage strategies, data access and security, in line with business requirements/processes, and create analytics-ready data.
As an Ab Initio ETL designer, you will develop complex Ab Initio ETL processes that turn business requirements into technology data solutions.
You will drive the design of scalable solutions while considering recoverability and resiliency requirements.
You will employ agile techniques such as task estimation, test automation, deployment automation and continuous integration to enhance overall execution speed and product quality.
You will convert source to target mappings into ETL code.
You will collaborate with the data modellers and contribute to the physical data model design.
You will work in line with Global Data Management Standards and follow all governance controls required for your work.
SOX/DQ - Perform gap analysis between existing SOX/DQ controls and data governance/data reconciliation.
Lineage - Work with the SOX Tech delivery team to ensure that lineage is captured and reconciliation between hops is documented.
Design and implement physical data and metadata repositories in Hive/Impala/HBase (as appropriate) corresponding to logical data models
Design and implement data service layer in Hadoop
Replicate data from current RDBMS based systems to the RFT Hadoop Data Lakes
Utilise a DevOps framework during development activities

Person Specification

Essential Skills/Basic Qualifications:
Passion for programming and software development
More than 5 years of development experience with ETL tools such as Ab Initio
Knowledge of the Hadoop ecosystem; Cloudera distribution preferred
Knowledge of the main Hadoop components: HDFS, ZooKeeper, YARN, HBase, Hive, Impala and Spark SQL
Working knowledge of Oracle, MySQL and Teradata databases.
Strong Unix understanding; knowledge of at least two scripting languages, with Python and/or shell scripting preferred
Experience creating narrative documents for governance to demonstrate that existing controls work.
Experience in data modelling, with good SQL skills in at least one enterprise-grade RDBMS
Experience in Test/Behaviour Driven Development (including test automation and mocking tools)
Strong project and stakeholder management skill set.
An eye for data quality issues and the ability to provide solutions to resolve and remediate them

Desirable skills/Preferred Qualifications:
Excellent Communication skills, articulate and detail oriented.
Strong Stakeholder management ability, experienced in managing high severity incidents and working under pressure.
Excellent interpersonal skills - exemplary team player and able to mentor junior members of the team.
Awareness of the different cultural backgrounds of users and team members
Ability to work under pressure to deliver to committed timescales
Familiarity or experience with Amazon AWS Cloud is an advantage
Experience in building large structured and semi-structured data stores.
Ability to build productive working relationships with other IT groups, business users and support teams.
Excellent problem solver.
Professional presentation and communication skills.
Detailed accountancy/Finance knowledge to facilitate the right transformation strategies.
Previous experience within Finance, Treasury and Risk functions
Credit Risk banking experience/exposure
Regulatory experience (Impairment, Capital, BCBS239, etc.)
DevOps experience
Experience working with Consumer, Risk or Finance departments
