Job Details

Senior Infrastructure Engineer (Apache Spark)

Artech Information Systems
Seattle, Washington, United States
Job Title: Spark Infrastructure Engineer
Location: Seattle, WA (initially remote, but must work from the Seattle office once pandemic restrictions are lifted)
Duration: 6 months

Job Description

Must-have requirements:
- Experience provisioning and operationally supporting Spark clusters on Kubernetes (EKS preferred)
- Experience working with AWS
- Experience with infrastructure-as-code tools such as Terraform or Pulumi
- At least 5 years of experience in the data infrastructure area

Nice-to-have requirements:
- Hadoop knowledge and experience
- Batch-processing technologies and data formats, e.g. Hive, Iceberg, Hudi, Avro, Parquet
- Apache Spark committer (someone who knows the internals of Spark)

