Job Details

Big Data Developer/Hadoop Developer ( W2)

Analysts International
Columbus, Ohio, United States

Data distribution team - distributing consumer data within the LOB (line of business)

  • Currently using Ab Initio, but moving to a Spark/Scala cloud architecture (candidate needs to focus on Spark/Scala/Docker/Kafka) - this is a modernization effort
  • In order of skillset preference:
  • 1. Spark/Scala - must have
  • 2. SQL - must have
  • 3. Kafka
  • 4. Docker containerization/Kubernetes

MUST WORK ON-SITE AFTER COVID

Role Responsibilities:

ETL Software Engineer for the ESDD team within Core Banking. A deep understanding of data architecture, ETL, and query optimization is required.

Strong skills in ETL (Spark/Scala preferred), cloud (AWS), Kafka, Kubernetes, Docker, SQL, Unix, mainframe, and security.

- provided by Dice
