Designing Clusters
Apache Hadoop 1.0
| Intermediate
- 6 Videos | 32m 23s
- Includes Assessment
- Earns a Badge
Hadoop is a framework for fast, reliable analysis of large data sets. Introduce yourself to supercomputing, and explore the design principles of using Hadoop as a supercomputing platform.
WHAT YOU WILL LEARN
- describe the principles of supercomputing
- recall the roles and skills needed for the Hadoop engineering team
- recall the advantages and shortcomings of using Hadoop as a supercomputing platform
- describe the three axioms of supercomputing
- describe the dumb hardware and smart software, and the share-nothing design principles
- describe the design principles of move processing not data, embrace failure, and build applications not infrastructure
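The "move processing, not data" principle above can be sketched in miniature: rather than shipping every record to one machine, each node runs a map function over the data block it already stores, and only the compact intermediate results are shuffled to a reducer. The function names and two-node simulation below are illustrative assumptions, not code from the course:

```python
# Illustrative sketch (not course material): MapReduce-style word count
# simulating "move processing, not data" on two local data blocks.
from collections import defaultdict
from itertools import chain

def map_phase(block):
    """Runs on the node that already stores `block` (processing moves to data)."""
    return [(word, 1) for line in block for word in line.split()]

def reduce_phase(pairs):
    """Aggregates the small intermediate (word, count) pairs after the shuffle."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two "data blocks", each imagined as stored on a different node.
blocks = [["big data big"], ["data big cluster"]]

# Map locally on each node; only the intermediates cross the network.
intermediate = chain.from_iterable(map_phase(b) for b in blocks)
result = reduce_phase(intermediate)
print(result)  # {'big': 3, 'data': 2, 'cluster': 1}
```

Only the small `(word, 1)` pairs travel between nodes, which is the bandwidth-saving point of the design principle.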
IN THIS COURSE
- 1. Defining Supercomputing | 5m 50s | UP NEXT
- 2. Examining Engineering Teams | 5m 43s
- 3. Exploring Big Data Solutions | 6m 41s
- 4. Examining Axioms of Supercomputing | 6m 36s
- 5. Exploring Design Principles for Hadoop | 2m 57s
- 6. Examining Additional Design Principles | 4m 37s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.