Big Data/Cloud Architect

Apex Systems
Minnesota, United States

Contract - Remote

Primary Responsibilities:

  • Provide technical leadership in the architecture, design, and engineering of the modernization of legacy Data Ingestion, ETL, and Database systems to new technologies in the Public Cloud (AWS/Azure), Big Data, and API space
  • Lead the technology transformation of legacy Data and Analytics platforms to a Big Data, Cloud-based modern software paradigm; be innovative in solution design and development to meet the needs of the business
  • Look across teams and products to find synergies, applying a solutioning mindset to identify and remove duplication
  • Actively mentor team members through code reviews, brown bags, tech talks, and design reviews
  • Influence the organization and its behaviors through tech talks, wiki pages, blogs, and design discussions
  • Produce solutions that are models of performance, scalability, and extensibility
  • Create next generation streaming applications to use Data as a strategic platform to grow the top line revenue
  • Stay abreast of leading-edge technologies in the industry evaluating emerging software technologies
  • Design solutions that incorporate bounded autonomy aligned with product and delivery org structures; you build the sandboxes in which everyone else plays
  • Encourage the growth of junior engineers on the team through skills development, mentoring and technical guidance
  • Create a startup mentality to accelerate the introduction of new capabilities and transform teams
  • Set automation standards and implement tooling to support and report adherence
  • Understand security threats and patterns comprehensively and ensure adherence to all regulatory initiatives: HIPAA, PHI, PII, locational data restrictions, contractual obligations, etc.
  • Stay current on new industry-wide threat vectors and ensure processes are updated to avoid or remediate them
  • Lead and develop security adherence mechanisms; train the team on areas of risk
  • Own the long-term technical roadmap, ensuring alignment to org strategic initiatives (e.g., no more thick clients, eliminating Oracle)
Required Qualifications:

  • Experience in ETL, Data warehousing concepts, Code management, automated testing
  • Development experience in the Big Data/Cloud ecosystem - ability to design, develop, document, and architect Hadoop/Cloud applications
  • Experience in Scala/Java/C#/NoSQL
  • Experience in Cloud computing - Azure, AWS, GCP
  • Experience with data pipeline/workflow management tools such as Airflow
  • Understanding of stream processing, with knowledge of Kafka
  • Knowledge of software engineering best practices, with experience implementing CI/CD using Jenkins
  • Knowledge of the Agile methodology for delivering software solutions
  • Azure Data Services, Synapse, Hadoop, MapR, Sqoop, Spark, Pig, Kafka, DevOps, DataOps, Kubernetes, Snowflake
Preferred Qualifications:

  • Experience with tools such as Chef or Puppet, and scripting knowledge of Terraform
  • Experience with Spark, Kafka
  • Healthcare domain knowledge