Job Details

Data Architect / Hadoop Big Data

Logic Soft, Inc.
Columbus, Ohio, United States

Interview: Video Interview

Skills Required:

  • 8+ years of data analysis/architecture experience with Waterfall and Agile methodologies across various domains (Healthcare preferred) in a data warehouse environment.
  • Good knowledge of relational databases, the Hadoop big data platform and tools, and Data Vault and dimensional model design.
  • Strong SQL experience (Oracle, Hive, and Impala preferred), including creating DDLs and DMLs in Oracle, Hive, and Impala.
  • Experience in analysis, design, development, support, and enhancements in a data warehouse environment with Cloudera Big Data technologies (Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, Hue, Kafka, YARN, Python, Flume, ZooKeeper, Sentry, Cloudera Navigator) along with Informatica.
  • Experience working with Sqoop scripts, PySpark programs, HDFS commands, HDFS file formats (Parquet, Avro, ORC, etc.), StreamSets pipelines, job scheduling, Hive/Impala queries, Unix commands, and shell scripting.
  • Experience migrating data from relational databases (Oracle preferred) to the Hadoop big data platform is a plus.
  • Experience eliciting, analyzing, and documenting functional and non-functional requirements.
  • Ability to document business, functional, and non-functional requirements, meeting minutes, and key decisions/actions.
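As a rough illustration of the DDL/DML work the role calls for, the sketch below uses Python's built-in sqlite3 purely as a runnable stand-in for an Oracle/Hive/Impala warehouse (the dialects differ; Hive DDL, for example, typically adds clauses such as STORED AS PARQUET). The table and column names are hypothetical.

```python
import sqlite3

# In-memory database as a stand-in for a data warehouse environment.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: create a hypothetical claims table. In Hive this statement might
# end with "STORED AS PARQUET" and use Hive types such as STRING.
cur.execute("""
    CREATE TABLE claims (
        claim_id   INTEGER PRIMARY KEY,
        member_id  TEXT NOT NULL,
        amount     REAL
    )
""")

# DML: insert a row and query it back.
cur.execute(
    "INSERT INTO claims (claim_id, member_id, amount) VALUES (?, ?, ?)",
    (1, "M100", 250.0),
)
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM claims").fetchone()[0]
print(total)  # 250.0
```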
- provided by Dice
