Working with Google BERT: Elements of BERT
Artificial Intelligence | Intermediate
- 15 Videos | 1h 8m
- Includes Assessment
- Earns a Badge
Adopting the foundational techniques of natural language processing (NLP), together with the Bidirectional Encoder Representations from Transformers (BERT) technique developed by Google, allows developers to integrate NLP pipelines into their projects efficiently and without the need for large-scale data collection and processing. In this course, you'll explore the concepts and techniques that lay the foundation for working with Google BERT. You'll start by examining aspects of NLP useful in developing advanced NLP pipelines, namely those related to supervised and unsupervised learning, language models, transfer learning, and transformer models. You'll then identify how BERT relates to NLP, explore its architecture and variants, and examine some real-world applications of this technique. Finally, you'll work with BERT on Amazon review and Twitter datasets to develop sentiment predictors and create classifiers.
WHAT YOU WILL LEARN
- discover the key concepts covered in this course
- compare approaches to supervised and unsupervised learning in NLP
- define the concept of language models and recognize their purpose
- list multiple legacy language models and their use cases
- describe how deep learning neural networks can create language models
- name state-of-the-art language models and recognize their utility
- describe the purpose of language representation in NLP pipelines and neural network models
- outline how developers make use of transfer learning
- describe the concept and purpose of transformer models
- describe Google BERT and how it is used in NLP products
- outline Google BERT's architecture and list use cases of its variants
- name multiple real-world problems in NLP that are solved by Google BERT
- work with an Amazon review dataset and Google BERT to develop sentiment predictors
- work with a Twitter dataset and Google BERT to create disaster Tweet classifiers
- summarize the key concepts covered in this course
IN THIS COURSE
- 1. Course Overview (1m 11s)
- 2. Supervised vs. Unsupervised NLP (5m 13s)
- 3. The Purpose of Language Models (5m)
- 4. Legacy Language Models (6m 45s)
- 5. Deep Learning-based Language Models (6m 21s)
- 6. Current State-of-the-Art Language Models (6m 37s)
- 7. The Purpose of Language Representation (2m 6s)
- 8. Transfer Learning in NLP (3m 34s)
- 9. The Purpose of Transformer Models (4m 59s)
- 10. Google BERT and NLP (5m 34s)
- 11. BERT Architecture and Variants (4m)
- 12. NLP Problems Solved by BERT (1m 51s)
- 13. Developing an Amazon Review Sentiment Predictor (7m 2s)
- 14. Creating a Disaster Tweet Classifier (6m 56s)
- 15. Course Summary (52s)
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.