Job Details

Big Data Engineer - Azure

NewWay Recruiting
Chicago, Illinois, United States

Seeking a candidate with strong skills in core areas of Big Data in Azure cloud environments: Data Lake file systems, Apache Hadoop and its ancillary distributions, ELK, data movement and transformation tools such as Databricks and Azure Data Factory, and exposure to Machine Learning and AI tool sets. The candidate should possess strong communication skills, strong analytical aptitude with critical thinking, and a solid understanding of reporting/dashboarding capabilities and the tools and platforms that support them.

The role requires advanced skills that enable the individual to deliver a high level of quality for ingesting, persisting and archiving vast amounts of enterprise data and to meet the expectations of the other teams within Global Data & Analytics.

Role Specific Responsibilities

  • Build, test, and run data assets tied to tasks and user stories in the Azure DevOps instance
  • Bring a level of technical expertise of the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture
  • Actively participate in regularly scheduled contact calls with the management team to transparently review the status of in-flight projects, the priorities of backlog projects, and the adoption of previous deliveries.
  • Work to resolve any issues with the portfolio of deliveries that are a part of the Big Data team's work pipeline.
  • Act as the last line of defense in ensuring deliveries meet the level of quality needed for downstream work to continue and satisfy users' expectations.

Knowledge Sharing / Documentation

  • Contribute to, produce, and maintain processes, procedures, and operational and architectural documentation
  • Change Control - ensure compliance with Processes and adherence to standards and documentation
  • Assist in mentoring other members of the team
  • Adoption - lead efforts to communicate and support overall adoption of new and/or enhanced Big Data capabilities

Education (degree): Bachelor's Degree

Other (Explain): College Diploma in Computer Science, or equivalent industry experience

Years of Experience:

  • 5+ years of demonstrated delivery experience with technical knowledge of Big Data environments and tool sets, particularly Microsoft Azure, ELK, and Apache Hadoop.

First Month Critical Outcomes:

  • Absorb strategic projects from the backlog and complete the related Big Data engineering work
  • Inspect existing run-state Big Data assets and identify optimizations and potential development opportunities

  • Deliver new Big Data assets as assigned

- provided by Dice
