Job Purpose

This position is responsible for collaborating with the Solutions Engineering, Infrastructure Operations, and Infrastructure Service Management teams to design and build infrastructure solutions and blueprints for the area of responsibility; participating in the design and build of repeatable patterns (build-kits) to improve deployment times for non-production and production environments; and transitioning knowledge to Infrastructure Operations.

Required Job Qualifications

• Bachelor's Degree and 5 years of Information Technology or relevant experience; OR Technical Certification and/or College Courses and 7 years of Information Technology experience; OR 9 years of Information Technology experience.
• Operations management.
• Experience with or advanced knowledge of HDFS, Spark, MapReduce, Hive, HBase, ZooKeeper, Impala, Solr, Oozie, NiFi, Flink, Sqoop, Pig, MongoDB, Kafka, and Flume.
• Ability to simplify and standardize complex concepts and processes.
• Understanding of business priorities (e.g., vision), trends (e.g., industry knowledge), and markets (e.g., existing and planned).
• Oral and written communication skills.
• Problem-solving and analytical skills, tools, and techniques.
• Supplier management.
• Ability to prioritize and make trade-off decisions.
• Ability to drive cross-functional execution.
• Adaptability and ability to introduce and manage change.
• Teamwork and collaboration.
• Organized and detail-oriented.

Preferred Job Qualifications

• Bachelor's Degree (Computer Science, MIS, or a related degree).
• 8+ years of experience with Big Data solutions and techniques.
• Willingness to provide on-call support if needed.
• 4+ years of Hadoop application infrastructure engineering and development methodology background.
• Experience with Ambari, Hortonworks, HDInsight, Cloudera distribution (CDH), and Cloudera Manager.
• Experience with cloud (Azure/AWS) big data solutions using EMR, HDInsight, Kinesis, Azure Event Hubs, etc.
• Experience evaluating COTS applications.
• Strong understanding of and experience with different cloud models, such as SaaS, PaaS, IaaS, DBaaS, and Infrastructure as Code (IaC).
• Strong knowledge of cloud data warehouses such as Synapse, CDP, SQL Data Warehouse, etc.
• Experience with multi-tenant platforms, including data segregation, resource management, access control, etc.
• Strong programming experience with Red Hat Linux, UNIX shell scripting, Java, Python, Scala, RDBMS, NoSQL, and ETL solutions.
• Experience with Kerberos, TLS encryption, SAML, and LDAP.
• Experience with full Hadoop SDLC deployments, including associated administration and maintenance functions.
• Experience developing Hadoop integrations for data ingestion, data mapping, and data processing.
• Experience designing application solutions that use enterprise infrastructure components such as storage, load balancers, 3-DNS, LAN/WAN, and DNS.
• Experience with concepts such as high availability, redundant system design, disaster recovery, and seamless failover.
• Expertise in common Hadoop file formats, including Avro, Parquet, and ORC.
• Experience with automation using Ansible, CloudFormation, ARM templates, PowerShell, etc.
• Experience with distributed systems/data lakes and MPP databases capable of efficiently processing terabytes of data, such as Teradata, Hadoop, and Netezza.
• Experience developing proofs-of-concept to benchmark key metrics for tool and architecture evaluation in big data and cloud technologies.
• Experience planning and deploying new Hadoop infrastructure, along with upgrades, cluster maintenance, troubleshooting, capacity planning, and resource optimization as part of Infrastructure Operations.
• Experience with data ingestion from Kafka and NiFi, and with using such data for streaming analytics.
• Overall knowledge of Big Data technology trends, vendors, and products.