Cluster Deployment

Apache Hadoop 1.0    |    Intermediate
  • 8 Videos | 1h 4m 47s
  • Earns a Badge
To deploy a Hadoop cluster, you must ensure that networks, disks, and hosts are configured correctly. Learn how to set up some of the common open-source software used to create and deploy a Hadoop ecosystem.

WHAT YOU WILL LEARN

  • build the Hadoop clients
  • configure Hive daemons
  • test the functionality of Flume, Sqoop, HDFS, and MapReduce
  • test the functionality of Hive and Pig
  • configure HCatalog daemons
  • configure Oozie
  • configure Hue and Hue users
  • install Hadoop onto the admin server

IN THIS COURSE

  • 1. Building Hadoop Clients | 10m 5s
  • 2. Configuring Hive Daemons | 5m 57s
  • 3. Validating Flume, Sqoop, HDFS, and MapReduce | 11m 22s
  • 4. Validating Hive and Pig | 8m 15s
  • 5. Configuring HCatalog Daemons | 10m 47s
  • 6. Configuring Oozie | 5m 39s
  • 7. Configuring Hue | 8m 26s
  • 8. Exercise: Format HDFS and Run a Hadoop Program | 4m 16s
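The final exercise, formatting HDFS and running a Hadoop program, can be sketched with standard Hadoop 1.0 commands along these lines (the input directory, sample file, and example jar version are assumptions, not taken from the course):

```shell
# Format the NameNode's HDFS metadata (run once, on a fresh cluster only).
hadoop namenode -format

# Start the HDFS and MapReduce daemons.
start-all.sh

# Stage some input data in HDFS (paths are illustrative).
hadoop fs -mkdir /user/hadoop/input
hadoop fs -put sample.txt /user/hadoop/input

# Run the bundled WordCount example (jar name varies by release).
hadoop jar hadoop-examples-1.0.4.jar wordcount \
    /user/hadoop/input /user/hadoop/output

# Inspect the job's output.
hadoop fs -cat /user/hadoop/output/part-r-00000
```

A sketch only: the exact jar name and paths depend on your installation, and formatting HDFS erases any existing filesystem metadata, so it should never be run on a cluster holding data.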

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
