Course details

Hadoop HDFS: Working with Files



Overview/Description

HDFS is the file system that enables the parallel processing of big data across a distributed cluster, which makes it a core tool for data science. In this Skillsoft Aspire course, you will explore the Hadoop file system using the HDFS dfs shell and perform basic file- and directory-level operations. You will transfer files between a local file system and HDFS and explore ways to create and delete files on HDFS.
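All of these operations are issued through a single shell entry point. As a minimal sketch (the command and path shown are only examples, not taken from the course), an hdfs dfs invocation takes the following form:

  # General form: hdfs dfs -<command> [options] <arguments>
  hdfs dfs -ls /user

  # Built-in usage text for any command
  hdfs dfs -help ls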



Expected Duration (hours)
0.8

Lesson Objectives

Hadoop HDFS: Working with Files

  • Course Overview
  • identify the different ways to use the ls and mkdir commands to explore and create directories on HDFS
  • transfer files from your local file system to HDFS using the copyFromLocal command
  • copy files from your local file system to HDFS using the put command
  • transfer files from HDFS to your local file system using the copyToLocal command
  • use the get and getmerge commands to retrieve one or more files from HDFS
  • work with the appendToFile and rm commands in the hdfs dfs shell
  • use HDFS shell commands to work with and manipulate files, as illustrated in the sketch after this list
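
As an illustrative sketch of the commands listed above (the directory and file names here are hypothetical examples, not taken from the course), a typical session in the hdfs dfs shell might look like the following:

  # Explore and create directories on HDFS
  hdfs dfs -ls /
  hdfs dfs -mkdir -p /user/analyst/sales

  # Copy local files to HDFS; copyFromLocal and put behave similarly,
  # and put -f overwrites an existing target file
  hdfs dfs -copyFromLocal orders.csv /user/analyst/sales/
  hdfs dfs -put -f returns.csv /user/analyst/sales/

  # Copy files from HDFS back to the local file system
  hdfs dfs -copyToLocal /user/analyst/sales/orders.csv ./orders_copy.csv
  hdfs dfs -get /user/analyst/sales/returns.csv ./returns_copy.csv

  # Merge every file in an HDFS directory into a single local file
  hdfs dfs -getmerge /user/analyst/sales ./sales_merged.csv

  # Append local content to an existing HDFS file, then delete files
  hdfs dfs -appendToFile new_orders.csv /user/analyst/sales/orders.csv
  hdfs dfs -rm /user/analyst/sales/returns.csv
  hdfs dfs -rm -r /user/analyst/sales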

Course Number
it_dshdfsdj_03_enus

Expertise Level
Beginner