Data Flow for the Hadoop Ecosystem
Apache Hadoop 2.0 | Intermediate
- 12 videos | 59m 5s
- Earns a Badge
Data must move into and through Hadoop for it to function. Here we look at Hadoop and data life cycle management, and use Sqoop and Hive to move data in and out.
WHAT YOU WILL LEARN
- describe data life cycle management
- recall the parameters that must be set in the Sqoop import statement
- create a table and load data into MySQL
- use Sqoop to import data into Hive
- recall the parameters that must be set in the Sqoop export statement
- use Sqoop to export data from Hive (a command-line sketch follows this list)
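The Sqoop objectives above can be sketched at the command line. This is a minimal sketch, not the course's own lab: the database salesdb, the tables orders and orders_out, the localhost connection, and the credentials are all assumed placeholders, while the flags shown (--connect, --table, --hive-import, --export-dir, --input-fields-terminated-by) are standard Sqoop 1.x options.

    # Create a source table in MySQL and load rows from a local CSV
    # (hypothetical database and table names).
    mysql -u root -p -e "
      CREATE DATABASE IF NOT EXISTS salesdb;
      CREATE TABLE salesdb.orders (id INT PRIMARY KEY, item VARCHAR(64), qty INT);
      LOAD DATA LOCAL INFILE 'orders.csv' INTO TABLE salesdb.orders
      FIELDS TERMINATED BY ',';"

    # Import the MySQL table into Hive; --hive-import creates the Hive
    # table and moves the imported files into the warehouse.
    sqoop import \
      --connect jdbc:mysql://localhost/salesdb \
      --username root -P \
      --table orders \
      --hive-import --hive-table orders \
      -m 1

    # Export a Hive table's files back to MySQL; Hive's default field
    # delimiter is \001, so it must be declared on export.
    sqoop export \
      --connect jdbc:mysql://localhost/salesdb \
      --username root -P \
      --table orders_out \
      --export-dir /user/hive/warehouse/orders \
      --input-fields-terminated-by '\001'

The key import parameters are the JDBC connection string, the source table, and the Hive target; the key export parameters are the target table, the HDFS directory to read, and the field delimiter of the stored files.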
- recall the three most common date datatypes and which systems support each
- use casting to import datetime stamps into Hive
- export datetime stamps from Hive into MySQL
- describe dirty data and how it should be preprocessed
- use Hive to create tables outside the warehouse
- use Pig to sample data (a sketch follows this list)
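The date-handling and sampling objectives can likewise be sketched, with assumed names throughout: the path /data/raw/events, the tables events_raw and events, and the format pattern yyyy-MM-dd HH:mm:ss are illustrative.

    # Hive: EXTERNAL keeps the files at the stated LOCATION instead of
    # moving them under the warehouse directory; the timestamp column
    # arrives as a string and is cast on the way into the managed table.
    hive -e "
      CREATE EXTERNAL TABLE events_raw (id INT, event_ts STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/data/raw/events';

      CREATE TABLE events AS
      SELECT id,
             CAST(from_unixtime(unix_timestamp(event_ts, 'yyyy-MM-dd HH:mm:ss'))
                  AS TIMESTAMP) AS event_ts
      FROM events_raw;"

    # Pig: SAMPLE keeps roughly the stated fraction of rows, a cheap way
    # to inspect dirty data before loading it anywhere.
    pig -e "
      raw = LOAD '/data/raw/events' USING PigStorage(',')
            AS (id:int, event_ts:chararray);
      sampled = SAMPLE raw 0.1;
      DUMP sampled;"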
IN THIS COURSE
- 12 videos: 8m 41s, 2m 40s, 1m 26s, 6m 35s, 3m 14s, 7m 23s, 1m 58s, 5m 50s, 4m 41s, 4m 24s, 7m 45s, 4m 27s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.