Job Details

Big Data Engineer

Advertiser
Zenith Services Inc.
Location
McLean, Virginia, United States
Rate
-
Title: Data Engineering
Location: McLean, VA
Duration: 5+ months

Must Have
- Mastery-level Big Data skills
- Flexible on language (Scala or Python), but expert-level proficiency in either Python or Scala is required
- Strong foundational REST background
- Understanding of Spark and HDFS
- Experience with microservices in a serverless environment
- REST API experience (experience connecting to APIs)
- Understanding of how messages and offsets work in a Kafka stream
- Cloud experience: AWS preferred, but GCP or Azure experience is also acceptable

Data Engineer Manager (Spark, Python, and AWS)

The Finance Technology team at Capital One is searching for innovative and analytical Data Engineers to join our team. Our data engineers are multilinguists who can speak the languages of how we operate as a business, how that business impacts our financials, and the latest technologies that are reshaping our Finance Tech landscape.

In this role, you will be responsible for building data pipeline frameworks using open-source tools on public cloud platforms. The right candidate is someone who is passionate about technology, interacts with product owners and technical stakeholders, thrives under pressure, and is hyper-focused on delivering exceptional results with good teamwork skills. You'll bring solid experience in emerging and cutting-edge technologies such as Spark, Scala/Java/Python, REST, JSON, NoSQL databases, relational databases, Jenkins/Maven, and AWS/cloud infrastructure, to name a few.

Responsibilities
- Provide technical guidance concerning the business implications of application development projects
- Leverage ETL programming skills in open-source languages including Python, Scala, and SQL on various frameworks, especially Apache Spark (a minimal sketch of this kind of pipeline follows below)
- Deploy DevOps techniques and practices such as Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable rapid delivery of working code, utilizing tools like Jenkins, Nexus, GitHub, and Docker
- Apply cloud computing experience, preferably with AWS and its services including S3, EMR, EC2, and Lambda functions
- Manage multiple responsibilities in an unstructured environment where you're empowered to make a difference; in that context, you will be expected to research and develop cutting-edge technologies to accomplish your goals
- Demonstrate a firm understanding of delivering large-scale data set solutions and SDLC best practices

Basic Qualifications
- Bachelor's degree
- At least 3 years of experience delivering data solutions using open-source languages
- At least 3 years of experience managing people/teams
- At least 3 years of experience utilizing scripting languages
- At least 4 years of experience developing ETL solutions
- At least 4 years of experience utilizing SQL in data solutions

Preferred Qualifications
- 3+ years of experience delivering data solutions using open-source languages
- 5+ years of experience utilizing SQL in complex data solutions
- 3+ years of experience utilizing scripting languages
- 5+ years of experience developing ETL solutions
- 5+ years of experience utilizing SQL in complex solutions
- 2+ years of experience developing, deploying, and testing in the AWS public cloud
- AWS certification
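
To illustrate the kind of Spark-based ETL pipeline work described under Responsibilities, here is a minimal PySpark sketch. The S3 bucket paths, column names, and transformation steps are hypothetical placeholders, not details from this posting.

# Illustrative PySpark ETL sketch: read raw JSON from S3, apply simple
# cleansing, and write partitioned Parquet back to S3.
# All bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-etl-example").getOrCreate()

# Extract: raw transaction events landed in S3 as JSON
raw = spark.read.json("s3://example-raw-bucket/transactions/")

# Transform: deduplicate and derive typed columns
clean = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount_usd", F.col("amount").cast("double"))
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# Load: partitioned Parquet for downstream SQL and analytics consumers
(clean.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/transactions/"))

A pipeline of this shape would typically be packaged and submitted to an EMR cluster, with builds and deployments handled through CI/CD tooling such as Jenkins, matching the stack the posting lists.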
