Clusters

Apache Hadoop 3.0
  • 8 Videos | 51m 32s
  • Includes Assessment
  • Earns a Badge
Clusters are used to store and analyze large volumes of data in a distributed computing environment. Explore the best practices to follow when implementing clusters in Hadoop.

WHAT YOU WILL LEARN

  • configure an Ubuntu server with SSH and Java for Hadoop
  • set up Hadoop on a single node
  • set up Hadoop on four nodes
  • describe the different cluster configurations, including single-rack, three-rack, and large-scale deployments
  • add a new node to an existing Hadoop cluster
  • format HDFS and configure common options
  • run an example MapReduce job to perform a word count
  • start a Hadoop cluster and run a MapReduce job
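One of the objectives above is configuring common options before formatting HDFS. As a minimal sketch, a single-node setup usually touches just two files; the property names below are standard Hadoop settings, but the host, port, and replication values are illustrative only:

```xml
<!-- core-site.xml: tells clients where to find the filesystem
     (hdfs://localhost:9000 is a typical single-node value, not a requirement) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: with only one DataNode, replication is usually lowered to 1 -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```

On a multinode cluster the replication factor would normally be left at its default of 3, which is why this option is revisited when the course moves from one node to four.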

IN THIS COURSE

  1. Prepare Ubuntu for Hadoop (6m 9s)
  2. Single Node Cluster (7m 40s)
  3. Small Multinode Cluster (8m 58s)
  4. Cluster Configurations (3m 52s)
  5. Expand a Hadoop Cluster (4m 50s)
  6. Format HDFS (5m 16s)
  7. Hadoop MapReduce Job (6m 9s)
  8. Exercise: Working with Clusters in Hadoop (5m 9s)
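The course culminates in running a MapReduce word count on a cluster. The logic of that job can be previewed locally with standard shell tools; this is a hypothetical analogue with made-up sample text, not the Hadoop job itself:

```shell
#!/bin/sh
# Local analogue of the MapReduce word-count flow:
# map (emit one word per record) -> shuffle (sort) -> reduce (count per key).
printf 'the quick brown fox jumps over the lazy dog the end\n' |
  tr ' ' '\n' |   # map: split the line into one word per line
  sort |          # shuffle: bring identical words together
  uniq -c |       # reduce: count occurrences of each word
  sort -rn        # present the most frequent words first
```

Hadoop's bundled wordcount example applies the same map, shuffle, and reduce steps, but distributes them across the cluster's nodes instead of a single pipeline.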

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft gives you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.

YOU MIGHT ALSO LIKE

  • CHANNEL: Apache HBase