Job Details

Data Engineer: Data Pipelines and GCP Infrastructure

Grand Rapids, Michigan, United States
Description

This role will work directly with the data lake team to build out the data platform infrastructure as well as the data pipelines that move data from source systems into the data platform (GCP). The developer will be expected to improve the existing CDC, file-based, and API-based pipelines, and to help build out the governance strategy for the data platform.

Must Haves

- A data engineering background working with data pipelines and infrastructure
- A willingness to learn and the ability to pick up new concepts quickly
- The ability to build tools incrementally rather than in a "big bang" manner
- An advocate for the business value behind the technical work
- Experience in an Agile development environment (SAFe would be a bonus)
- GCP data analytics experience:
  - Terraform
  - Dataflow (Java) with JUnit testing
  - BigQuery
  - Google Cloud Storage
  - Pub/Sub
  - Cloud Functions (Python) with PyTest
  - Stackdriver
- The ability to write modularized and testable code
- Git repository experience (preferably Bitbucket)

Nice to Haves

- Qlik Replicate
- Firestore
- Jenkins
- Visualization tools (MicroStrategy, Data Studio)
- Google Data Catalog
- GCP Cloud Monitoring
