Using Hive & Pig with Hadoop

Apache Hadoop    |    Beginner
  • 7 videos | 32m 57s
  • Includes Assessment
  • Earns a Badge
Rating: 4.4 (25 users)
There are components other than MapReduce that let you write code to process large data sets stored in Hadoop. Let's see how to work with two such components - Hive and Pig.

WHAT YOU WILL LEARN

  • Understand the basics of Apache Hive and HiveQL, describe how HiveQL is similar to ANSI SQL and how it can be used to select data, and understand how HiveQL is implicitly transformed into MapReduce jobs.
  • Understand the usage of the four file formats supported in Hive, which are TEXTFILE, SEQUENCEFILE, ORC, and RCFILE, and demonstrate and describe each of the four.
  • Understand how to use custom Hive data types such as arrays and maps to write custom Hive jobs, and learn Hive DDL commands.
  • Understand Pig and how it is used, demonstrate how to use Pig Latin like SQL to obtain data, and understand how to use Pig as a component to build complex and large MapReduce applications.
  • Learn how to write Pig scripts, and understand the Pig modes: local, MapReduce, and batch.
  • Learn Pig commands such as LOAD, LIMIT, DUMP, and STORE for data read/write operations in Pig Latin, and understand the Grunt commands used for DDL.
  • Compare and contrast the internals and performance of MapReduce, Hive, and Pig, and understand the strengths and weaknesses of the three.

IN THIS COURSE

  • 3m 51s
    In this video, you will learn the basics of Apache Hive and HiveQL, see how HiveQL is similar to ANSI SQL and how it can be used to select data, and understand how HiveQL is implicitly transformed into MapReduce jobs. A short HiveQL sketch follows the course outline.
  • 8m 32s
    In this video, you will learn how to use the four file formats supported in Hive: TEXTFILE, SEQUENCEFILE, ORC, and RCFILE. Each of the four is demonstrated and described.
  • 3.  Working with Custom Hive Data Types
    3m 47s
    In this video, you will learn how to use custom Hive data types such as arrays and maps to write custom Hive jobs, along with Hive DDL commands.
  • 4.  Using Pig Latin to Communicate with Hadoop
    3m 32s
    In this video, you will learn what Pig is and how it is used, how to use Pig Latin like SQL to obtain data, and how to use Pig as a component to build complex and large MapReduce applications.
  • 5.  Writing Pig Scripts
    6m 31s
    In this video, you will learn how to write Pig scripts and understand the Pig execution modes: local, MapReduce, and batch.
  • 6.  Loading and Storing Data in Pig
    3m 13s
    In this video, you will learn how to use Pig commands such as LOAD, LIMIT, DUMP, and STORE for data read/write operations in Pig Latin, and understand the Grunt commands used for DDL. A short Pig Latin sketch follows the course outline.
  • 7.  Comparing Performance: MapReduce, Hive, and Pig
    3m 31s
    In this video, you will compare and contrast the internals and performance of MapReduce, Hive, and Pig, and understand the strengths and weaknesses of the three.
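
To make the Hive videos above more concrete, here is a minimal HiveQL sketch that combines an ANSI-SQL-style SELECT, the STORED AS clause used to pick among the four file formats, complex types such as arrays and maps, and a couple of DDL commands. The employees table, its columns, and the delimiters are hypothetical examples, not taken from the course.

    -- Hive DDL: a table with complex types, stored as delimited text
    -- (swap TEXTFILE for SEQUENCEFILE, RCFILE, or ORC in the STORED AS clause
    --  to change the underlying file format)
    CREATE TABLE IF NOT EXISTS employees (
      name     STRING,
      salary   DOUBLE,
      skills   ARRAY<STRING>,          -- complex type: array
      contacts MAP<STRING, STRING>     -- complex type: map
    )
    ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ','
      COLLECTION ITEMS TERMINATED BY '|'
      MAP KEYS TERMINATED BY ':'
    STORED AS TEXTFILE;

    -- A columnar copy of the same data, stored as ORC
    CREATE TABLE employees_orc STORED AS ORC AS SELECT * FROM employees;

    -- Other DDL commands
    SHOW TABLES;
    DESCRIBE employees;

    -- An ANSI-SQL-style query; Hive implicitly turns this into MapReduce jobs
    SELECT name, salary, skills[0] AS first_skill, contacts['email'] AS email
    FROM employees
    WHERE salary > 50000.0
    ORDER BY salary DESC
    LIMIT 10;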
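
Likewise, here is a minimal Pig Latin sketch of the LOAD, LIMIT, DUMP, and STORE operators alongside some SQL-like transformations; the input path, field names, and relation aliases are hypothetical. Saved as a .pig file, the script runs in batch mode with pig -x local (local mode) or pig -x mapreduce (on the cluster), and the same statements can be entered interactively in the Grunt shell.

    -- Load delimited data into a relation with a declared schema
    employees = LOAD '/data/employees.csv'
                USING PigStorage(',')
                AS (name:chararray, dept:chararray, salary:double);

    -- SQL-like transformations expressed in Pig Latin
    high_paid = FILTER employees BY salary > 50000.0;
    by_dept   = GROUP high_paid BY dept;
    avg_pay   = FOREACH by_dept GENERATE group AS dept, AVG(high_paid.salary) AS avg_salary;

    -- Diagnostic command, also handy from the Grunt shell
    DESCRIBE avg_pay;

    -- Show a small sample on screen, then write the full result out
    preview = LIMIT avg_pay 10;
    DUMP preview;
    STORE avg_pay INTO '/output/avg_salary_by_dept' USING PigStorage(',');

In MapReduce mode, Pig compiles these statements into MapReduce jobs, much as Hive does with HiveQL.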

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
