Deep Learning for NLP: Memory-based Networks

Natural Language Processing    |    Intermediate
  • 12 videos | 1h 27m 5s
  • Includes Assessment
  • Earns a Badge
Rating: 3.8 (4 ratings)
In the journey to understand deep learning models for natural language processing (NLP), the next step is memory-based networks, which handle extended context in language far better than their predecessors. While basic neural networks outperform classical machine learning (ML) models, they still fall short on larger language-data problems. In this course, you will learn about memory-based networks such as the gated recurrent unit (GRU) and long short-term memory (LSTM). Explore their architectures and variants, and where they succeed and fail for NLP. Then, consider their implementations using product review classification data and compare the results to understand each architecture's effectiveness. Upon completing this course, you will understand the basics of memory-based networks, their implementation in TensorFlow, and the effect of memory and extended context on NLP datasets.
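The GRU named above maintains a hidden state through update and reset gates. As a rough, pure-Python sketch of a single GRU step (scalar state and toy weights chosen purely for illustration — the course itself works in TensorFlow):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One GRU step for a scalar input and state, following the standard
    update/reset-gate equations. Weights are illustrative, not trained."""
    z = sigmoid(w["wz"] * x + w["uz"] * h + w["bz"])                # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h + w["br"])                # reset gate
    h_cand = math.tanh(w["wh"] * x + w["uh"] * (r * h) + w["bh"])   # candidate state
    return (1.0 - z) * h + z * h_cand   # interpolate between old and candidate state

# Toy weights (hypothetical values, for illustration only)
weights = {"wz": 0.5, "uz": 0.1, "bz": 0.0,
           "wr": 0.8, "ur": 0.2, "br": 0.0,
           "wh": 1.0, "uh": 0.5, "bh": 0.0}

h = 0.0
for x in [1.0, -0.5, 0.25]:   # a short toy input sequence
    h = gru_step(x, h, weights)
print(round(h, 4))
```

Because the new state is a gate-weighted blend of the old state and a bounded candidate, the hidden state stays in (-1, 1) while still carrying information across timesteps.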

WHAT YOU WILL LEARN

  • Discover the key concepts covered in this course
  • Outline the importance of memory-based learning and the different networks it supports
  • Outline gated recurrent unit (GRU) and how it differs from recurrent neural networks (RNNs)
  • Outline long short-term memory (LSTM) networks and how they differ from RNNs
  • Illustrate how LSTM networks outperform RNNs and mitigate the vanishing gradient problem
  • Illustrate different types of LSTM networks
  • Perform data preparation for LSTM and GRU networks
  • Perform review classification using GRU
  • Perform review classification using LSTM
  • Perform review classification using bidirectional long short-term memory (Bi-LSTM)
  • Compare results of important features across different networks
  • Summarize the key concepts covered in this course
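Several of the objectives above center on the LSTM's gating. As a minimal pure-Python sketch of one LSTM step (scalar state and toy weights chosen for illustration only — the course implementations use TensorFlow):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One LSTM step for a scalar input and state: forget, input, and output
    gates plus an additive cell-state update. Weights are illustrative."""
    f = sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])         # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])         # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])         # output gate
    c_cand = math.tanh(w["wc"] * x + w["uc"] * h + w["bc"])  # candidate cell value
    c_new = f * c + i * c_cand        # additive update: keep old memory, add new
    h_new = o * math.tanh(c_new)      # exposed hidden state
    return h_new, c_new

# Toy weights (hypothetical values, for illustration only)
weights = {"wf": 0.6, "uf": 0.1, "bf": 1.0,
           "wi": 0.7, "ui": 0.2, "bi": 0.0,
           "wo": 0.5, "uo": 0.1, "bo": 0.0,
           "wc": 1.0, "uc": 0.3, "bc": 0.0}

h, c = 0.0, 0.0
for x in [1.0, 0.0, -0.5]:   # a short toy input sequence
    h, c = lstm_step(x, h, c, weights)
print(round(h, 4), round(c, 4))
```

The key difference from the GRU is the separate cell state `c`, updated additively through the forget and input gates rather than overwritten, which is what lets gradients flow across many timesteps.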

IN THIS COURSE

  • 1m 17s
  • 4m 28s
    In this video, you will outline the importance of memory-based learning and the different types of memory it supports.
  • 3.  Gated Recurrent Unit (GRU) Architecture
    9m 9s
    In this video, you will outline a gated recurrent unit (GRU) and how it differs from recurrent neural networks (RNNs).
  • 4.  Long Short-term Memory (LSTM) Architecture
    8m 40s
    In this video, you will outline long short-term memory (LSTM) networks and how they differ from traditional recurrent neural networks (RNNs).
  • 5.  Fall of RNN versus Rise of LSTM
    1m 53s
    After completing this video, you will be able to illustrate how LSTM networks outperform traditional RNNs and mitigate the vanishing gradient problem.
  • 6.  Variants of LSTM Networks
    4m 39s
    Upon completion of this video, you will be able to illustrate different types of LSTM networks.
  • 7.  Product Review Data Preparation for Modeling
    10m 35s
    In this video, learn how to prepare data for LSTM and GRU networks.
  • 8.  Product Review Data Classification Using GRU
    10m 16s
    Learn how to classify reviews using GRU.
  • 9.  Product Review Data Classification Using LSTM
    10m 58s
    Find out how to classify reviews using LSTM.
  • 10.  Product Review Data Classification Using Bi-LSTM
    12m 9s
    Learn how to classify reviews using bidirectional long short-term memory (Bi-LSTM).
  • 11.  Result Comparison between RNN, GRU, and LSTM
    11m 35s
    Learn how to compare the results of important features across different networks.
  • 12.  Course Summary
    1m 26s
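Video 5's point about vanishing gradients can be sketched numerically. The factors below are illustrative assumptions, not values from the course: backpropagating through a plain RNN multiplies many per-step factors that are typically below 1, while the LSTM's cell-state path is scaled by forget-gate values that can stay close to 1.

```python
# Toy illustration (assumed factors): long products of per-step gradients
# below 1 vanish, while an additive cell state gated near 1 preserves signal.
rnn_grad = 1.0
lstm_grad = 1.0
for _ in range(50):        # 50 timesteps of backpropagation through time
    rnn_grad *= 0.5        # typical |tanh'| * |weight| factor < 1 (assumed)
    lstm_grad *= 0.99      # forget gate near 1 on the cell-state path (assumed)
print(f"RNN-path gradient after 50 steps:  {rnn_grad:.2e}")
print(f"LSTM-path gradient after 50 steps: {lstm_grad:.2e}")
```

After 50 steps the RNN-path factor has shrunk by many orders of magnitude, while the LSTM-path factor remains usable — the intuition behind "Fall of RNN versus Rise of LSTM."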

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft gives you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
