Hadoop Design Principles
Apache Hadoop 2.0
| Intermediate
- 11 videos | 42m 45s
- Includes Assessment
- Earns a Badge
Hadoop's HDFS is a highly fault-tolerant distributed file system suited to applications with large data sets. Explore the principles of supercomputing and Hadoop's open-source software components.
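HDFS achieves its fault tolerance by splitting each file into large fixed-size blocks and storing several replicas of every block on different DataNodes. The sketch below is illustrative only (it is not the HDFS API); the 128 MB block size and replication factor of 3 are HDFS defaults, and the round-robin placement stands in for HDFS's real rack-aware placement policy.

```python
BLOCK_SIZE = 128 * 1024 * 1024   # 128 MB, the HDFS default block size
REPLICATION = 3                  # HDFS default replication factor

def plan_blocks(file_size_bytes, datanodes):
    """Return (block_index, [datanodes holding a replica]) for each block."""
    n_blocks = -(-file_size_bytes // BLOCK_SIZE)  # ceiling division
    plan = []
    for i in range(n_blocks):
        # Simple round-robin replica placement; real HDFS is rack-aware.
        replicas = [datanodes[(i + r) % len(datanodes)]
                    for r in range(REPLICATION)]
        plan.append((i, replicas))
    return plan

# A 300 MB file on a 4-node cluster needs 3 blocks, each stored 3 times.
nodes = ["dn1", "dn2", "dn3", "dn4"]
for block, replicas in plan_blocks(300 * 1024 * 1024, nodes):
    print(block, replicas)
```

Because every block lives on three nodes, the loss of any single DataNode leaves every block still readable, which is the "embracing failure" principle the course covers.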
WHAT YOU WILL LEARN
- recall the design principles of Hadoop
- describe the design principles of sharing nothing
- describe the design principles of embracing failure
- describe the components of the Hadoop Distributed File System (HDFS)
- describe the four main HDFS daemons
- describe Hadoop YARN
- describe the role of the ResourceManager daemon
- describe the YARN NodeManager and ApplicationMaster daemons
- define MapReduce and describe its relation to YARN
- describe data analytics
- describe the reasons for the complexities of the Hadoop ecosystem
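One objective above is to define MapReduce. The following is a minimal, framework-free sketch of the model's three stages (map, shuffle, reduce) using the classic word-count example; a real Hadoop job would implement Mapper and Reducer classes in Java and run as a YARN application, with YARN's ResourceManager and NodeManagers supplying the containers it executes in.

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit a (word, 1) pair for every word in every input record.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: aggregate the grouped values for each key.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["hadoop stores data", "hadoop processes data"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)  # {'hadoop': 2, 'stores': 1, 'data': 2, 'processes': 1}
```

The split into independent map tasks and independent reduce tasks is what lets the framework scatter work across a cluster with no shared state between tasks, the "sharing nothing" principle named in the objectives.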
IN THIS COURSE
- 11 videos: 4m 17s, 3m 12s, 4m 22s, 4m 46s, 3m 23s, 3m 37s, 2m 33s, 3m 9s, 2m 30s, 4m 54s, 6m 3s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft gives you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.