Clusters
Apache Hadoop 3.0 | Intermediate
- 8 videos | 48m 2s
- Includes Assessment
- Earns a Badge
Clusters are used to store and analyze large volumes of data in a distributed computing environment. Explore the best practices to follow when implementing clusters in Hadoop.
WHAT YOU WILL LEARN
- configure an Ubuntu server with SSH and Java in preparation for Hadoop
- set up Hadoop on a single node
- set up Hadoop on four nodes (a quick verification sketch follows this list)
- describe the different cluster configurations, including single-rack deployments, three-rack deployments, and large-scale deployments
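The course walks through these setups in the hands-on videos. As a quick way to confirm that a multinode cluster actually came up, the minimal Java sketch below asks the NameNode for its live DataNodes via the HDFS client API. It is not part of the course materials, and the hdfs://master:9000 address is a hypothetical placeholder for your own NameNode URI.

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

// Minimal sketch: list the DataNodes the NameNode currently knows about.
// "hdfs://master:9000" is a hypothetical NameNode URI; substitute your own.
public class ListDataNodes {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);
    if (fs instanceof DistributedFileSystem) {
      DistributedFileSystem dfs = (DistributedFileSystem) fs;
      // One entry per live DataNode; a healthy four-node setup should show four.
      for (DatanodeInfo node : dfs.getDataNodeStats()) {
        System.out.println(node.getHostName());
      }
    }
  }
}
```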
- add a new node to an existing Hadoop cluster
- format HDFS and configure common options
- run an example MapReduce job to perform a word count (a sketch follows this list)
- start a Hadoop cluster and run a MapReduce job
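For the word-count objective, the sketch below follows the shape of the classic WordCount example from the Apache Hadoop MapReduce tutorial: the mapper emits a (word, 1) pair per token and the reducer sums the counts per word. Input and output HDFS paths are taken from the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: split each input line into tokens and emit (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer (also used as combiner): sum the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not exist yet
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar (names here are hypothetical), a run might look like `hadoop jar wordcount.jar WordCount /input /output`, after which `hdfs dfs -cat /output/part-r-00000` shows the per-word counts.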
IN THIS COURSE
- 1. Prepare Ubuntu for Hadoop | 6m 9s
- 2. Single Node Cluster | 7m 40s
- 3. Small Multinode Cluster | 8m 58s
- 4. Cluster Configurations | 3m 52s
- 5. Expand a Hadoop Cluster | 4m 50s
- 6. Format HDFS | 5m 16s
- 7. Hadoop MapReduce Job | 6m 9s
- 8. Exercise: Working with Clusters in Hadoop | 5m 9s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.