Big Data Architect

Cognizant Technology Solutions
Teaneck, New Jersey, United States
Cognizant is always looking for top talent. We are searching for candidates to fill future needs within the business. This job posting represents potential future employment opportunities with Cognizant: although the position is not currently open, it gives you the opportunity to express your interest in future roles. If a job opportunity that you may be qualified for becomes available, we will notify you, and you can then decide whether to apply for the specific open position. Thank you for your interest in Cognizant career opportunities.

Cognizant is looking for a Big Data Architect to join our Artificial Intelligence and Analytics (AIA) practice. You are a trusted advisor, responsible for defining the approach for the overall project. As a subject matter expert, you will drive technology discussions and analyze the existing architecture for gaps in addressing business needs. You are a thought leader, comfortable challenging the status quo to enhance our current services and technologies.

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future: a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies.

By applying AI and data science, we help leading companies prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant's AIA practice takes insights that are buried in data and gives businesses a clear way to transform how they source, interpret, and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turn data resources into insightful, actionable intelligence.

Preferred locations (but not limited to): Denver; Tampa; Chicago; Charlotte; Houston and Dallas, TX; San Francisco; Atlanta; NY/NJ

You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.

Position Responsibilities:
  • Manage data-related requests, analyze issues, and provide efficient resolution. Design all program specifications and perform required tests.
  • Write code for all modules according to the required specifications.
  • Monitor all production issues and inquiries and provide efficient resolution.
  • Evaluate all functional requirements, map documents, and troubleshoot all development processes.
  • Collaborate with application groups to prepare effective solutions for all programs.
  • Document all technical specifications and associated project deliverables.
  • Design all test cases to provide support to all systems and perform unit tests.
  • Collaborate with other technology teams to work with business executives and end users to conceptualize new application projects, recommend technologies, design patterns and implementation strategies.
  • Help onboard cloud and data engineering technologies such as Spark and Kafka: usage and design guidelines.

Position Qualifications:
  • BS in Engineering or a related field with at least 10 years of professional IT experience
  • Experience in data platform architecture and design for cloud data engineering or the big data ecosystem.
  • Experience building scalable data ingestion frameworks and curation pipelines in any cloud platform.
  • Hands-on experience in Scala, Spark/PySpark programming.
  • Experience designing batch data ingestion strategies using Spark Streaming and Kafka.
  • Experience defining programming constructs for scaling and container-native execution
  • Good understanding of security and the hooks to be built in for enterprise adoption; awareness of how data security and governance tools interact
  • Exposure to real-time and event-based data processing using streaming (DStreams/Structured Streaming in Spark)
  • Experience in NoSQL using MongoDB or Cassandra