Job ID T4366 - AWS/Public Cloud Big Data Developer

This person must be willing to relocate to Hartford once the WFH mandate has lifted.

------------

Currently we have a gap with employee developers who have AWS/Public Cloud and Big Data experience.

- Minimum 7 years of hands-on software development experience, with an extensive background and concentration in AWS, Cloud Foundry, and/or other public cloud development, is required.
- Experience with Kubernetes and Docker
- Knowledge of Hadoop/Big Data and the Hadoop ecosystem required - proven experience within the Hortonworks Hadoop ecosystem (MapReduce, HDFS, YARN, Hive, HBase, Sqoop, Pig, Hue, Spark, Kafka, etc.)
- Design, implement, and deploy custom applications on Hadoop using Python/PySpark
- Person hired will develop and implement AWS solutions using EC2, S3, RDS, Redshift, Snowflake, Application Load Balancer, CloudWatch, and/or other supporting cloud technologies.
- Troubleshoot production issues within the Cloud/Hadoop/Big Data environment
- Performance tuning of AWS/Big Data processes and applications
- Must be proficient in SQL/HiveQL
- Hands-on expertise in Linux/Unix and scripting skills are required.
- Working knowledge of microservice-based Service-Oriented Architecture (SOA)
- Good knowledge of Agile methodology and the Scrum process
- Promote the development and quick deployment of Minimum Viable Product (MVP) applications to have a real and timely impact on business units; must be comfortable pivoting mid-course to address new information and adapt to the changing realities of a dynamic environment
- Strong communication, technology awareness, and the ability to interact and work with senior technology leaders is a must
- Experience with ETL and data integration tools such as Ab Initio, Talend, or Informatica is preferred.