Designing Clusters

Apache Hadoop 1.0
  • 6 Videos | 35m 23s
  • Includes Assessment
  • Earns a Badge
Apache Hadoop is a framework for fast, reliable analysis of very large data sets. Get an introduction to supercomputing and explore the design principles behind using Hadoop as a supercomputing platform.

WHAT YOU WILL LEARN

  • describe the principles of supercomputing
  • recall the roles and skills needed for the Hadoop engineering team
  • recall the advantages and shortcomings of using Hadoop as a supercomputing platform
  • describe the three axioms of supercomputing
  • describe the "dumb hardware, smart software" and "share nothing" design principles
  • describe the "move processing, not data", "embrace failure", and "build applications, not infrastructure" design principles

IN THIS COURSE

  1. Defining Supercomputing (5m 50s)
  2. Examining Engineering Teams (5m 43s)
  3. Exploring Big Data Solutions (6m 41s)
  4. Examining Axioms of Supercomputing (6m 36s)
  5. Exploring Design Principles for Hadoop (2m 57s)
  6. Examining Additional Design Principles (4m 37s)

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft gives you the opportunity to earn a digital badge upon successful completion of this course. The badge can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
