AWS Big Data Developer #1705
In this position, you will be responsible for data management tasks including designing, developing, and maintaining applications that build and optimize the Data Lake and data pipeline architecture using AWS Big Data tools, Spark/Scala, and Python. As a key contributor, you will support multiple teams, systems, and products by monetizing data through machine learning algorithms and creating prototypes for predictive models. If you enjoy optimizing data systems and building them from the ground up, this position is for you!
Experience and Education:
- Bachelor's degree in Computer Science or related field
- Experience with manipulating and transforming data
- 3+ years of IT experience in Data Warehouse/Big Data/BI tools and technologies
- 3+ years of experience in developing applications in Big Data platform (AWS)
Skills and Strengths:
- Hadoop ecosystem
- Cloud data warehousing (Snowflake)
- Apache Spark
- Data Lake
- Data Pipeline Architecture
- Pipeline Builder
- Big Data Infrastructure
- Optimizing Data Systems
- Machine Learning Algorithms
- Windows Operating Systems
Primary Job Responsibilities:
- Build distributed Big Data solutions including ingestion, caching, processing, consumption, logging & monitoring.
- Develop technical and user-focused documentation (data models, data dictionaries, business glossaries, process flows, architecture).
- Build and incorporate automated unit tests and participate in integration testing efforts.
- Develop applications and custom integration solutions using Spark/Scala.
- Develop Snowflake deployment and usage best practices.
- Understand the existing environment, translate its elements into new environment requirements, and plan tasks accordingly.
- Work in state-of-the-art programming languages and utilize object-oriented approaches in designing, coding, testing, and debugging programs.
- Deliver high-quality code for a module, lead validation across all types of testing, and support activities related to implementation, transition, and warranty.
- Understand specifications, then plan, design, and develop software solutions while adhering to process, either individually or collectively within a project team.
- Define data/information architecture standards, policies, and procedures for the organization.
- Build data pipeline frameworks to automate high-volume and real-time data delivery for our Data Lake and Data pipeline architecture.
- Ensure optimal data delivery architecture is consistent throughout ongoing projects.
- Continuously integrate and deploy code into cloud environments.
- Must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Ranger Technical Resources is an information technology firm based in Ft. Lauderdale that has been providing IT solutions to South Florida customers since 1996. We are a unique firm in that we have two groups supporting both our clients' needs for individual IT professionals (contract or direct hire) and technology implementation services.
Job Type: Permanent
Education: Bachelor's Degree or equivalent experience
Experience: 3+ years
Job Reference #: 1705
Location: Miami, FL
- provided by Dice