Clusters

Apache Hadoop 3.0    |    Intermediate
  • 8 videos | 48m 2s
  • Includes Assessment
  • Earns a Badge
Clusters are used to store and analyze large volumes of data in a distributed computing environment. Explore the best practices to follow when implementing clusters in Hadoop.

WHAT YOU WILL LEARN

  • Configure an Ubuntu server with SSH and Java for Hadoop
  • Set up Hadoop on a single node
  • Set up Hadoop on four nodes
  • Describe the different cluster configurations, including single-rack deployments, three-rack deployments, and large-scale deployments
  • Add a new node to an existing Hadoop cluster
  • Format HDFS and configure common options
  • Run an example MapReduce job to perform a word count
  • Start a Hadoop cluster and run a MapReduce job
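The last three objectives correspond to standard Hadoop CLI steps. As a minimal sketch, assuming Hadoop 3.x is already installed with HADOOP_HOME set and the configuration files in place (the input path and example-jar version shown here are illustrative), the sequence looks like:

    # Format the NameNode's metadata storage (first run only -- this erases HDFS)
    hdfs namenode -format

    # Start the HDFS and YARN daemons
    start-dfs.sh
    start-yarn.sh

    # Copy some local text into HDFS (paths are illustrative)
    hdfs dfs -mkdir -p input
    hdfs dfs -put localfile.txt input

    # Run the bundled word-count MapReduce example; the jar version
    # must match your installation
    hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \
        wordcount input output

    # Inspect the reducer output
    hdfs dfs -cat output/part-r-00000

The course walks through these steps in detail, including the configuration options that must be set before the daemons will start.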

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
