AI Practitioner: BERT Best Practices & Design Considerations

Artificial Intelligence
  • 17 Videos | 58m 46s
  • Includes Assessment
  • Earns a Badge
Bidirectional Encoder Representations from Transformers (BERT), a natural language processing technique, takes the capabilities of language AI systems to new heights. Google's BERT achieves state-of-the-art performance on several complex natural language understanding tasks. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques like BERT. You'll define the terms "attention" and "transformer" and explore how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as search engine optimization (SEO), along with techniques like masking. Next, you'll work with an NLP pipeline utilizing BERT in Python for various tasks, namely text tokenization and encoding, model definition and training, and data augmentation and prediction. Finally, you'll recognize the benefits of using BERT and TensorFlow together.

WHAT YOU WILL LEARN

  • discover the key concepts covered in this course
  • recall traditional natural language processing techniques and approaches
  • describe the limitations of traditional natural language processing techniques and list potential breakthroughs
  • define the terms "attention" and "transformer" as they relate to natural language processing
  • specify the role of natural language processing techniques like BERT
  • describe how utilizing BERT techniques helps improve search quality
  • outline how BERT techniques facilitate context specificity
  • list ways of using BERT techniques for search engine optimization
  • describe how masking is used in BERT
  • demonstrate how to do data augmentation using masking and BERT in Python (see the first code sketch after this list)
  • illustrate how to do text tokenization using BERT in Python (second sketch below)
  • show how to do text encoding using BERT in Python (second sketch below)
  • define a BERT model in Python and create and compile the BERT layer using TensorFlow (third sketch below)
  • train a BERT model in Python and identify the various hyperparameters for BERT (third sketch below)
  • demonstrate how to do data prediction using BERT in Python: load a trained BERT model, create the sample data, and predict using the model (fourth sketch below)
  • describe how the use of the TensorFlow package can advance BERT techniques
  • summarize the key concepts covered in this course
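
To give a flavor of the masking and data augmentation objectives, here is a minimal sketch of masking-based augmentation with a pretrained BERT masked-language model. The library choice (Hugging Face transformers) and the "bert-base-uncased" checkpoint are illustrative assumptions, not the course's exact tooling.

    # Masking-based data augmentation sketch. The transformers library and
    # the "bert-base-uncased" checkpoint are assumptions for illustration.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    # Replace one word with [MASK] and let BERT's masked-language-model head
    # propose substitutes, yielding paraphrased variants for training data.
    masked_sentence = "The service at the restaurant was [MASK]."
    for candidate in fill_mask(masked_sentence, top_k=3):
        print(candidate["sequence"], candidate["score"])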
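
For the tokenization and encoding objectives, the following sketch separates the two steps using a BERT WordPiece tokenizer. BertTokenizer again comes from the assumed transformers library; the tokenization-then-encoding flow itself is standard BERT preprocessing.

    # Tokenization and encoding sketch (transformers' BertTokenizer is an
    # assumed toolkit; the WordPiece behavior shown is standard for BERT).
    from transformers import BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

    # Step 1 - tokenization: raw text to WordPiece tokens.
    print(tokenizer.tokenize("BERT improves search quality."))

    # Step 2 - encoding: tokens to integer IDs, with [CLS]/[SEP] added and
    # padding/truncation to a fixed length so batches share one shape.
    encoded = tokenizer(
        "BERT improves search quality.",
        max_length=16,
        padding="max_length",
        truncation=True,
        return_tensors="tf",  # TensorFlow tensors, matching the course's stack
    )
    print(encoded["input_ids"])
    print(encoded["attention_mask"])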
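
The model definition and training objectives can be pictured as follows: a BERT encoder pulled in as a Keras layer, a small classification head on top, and typical fine-tuning hyperparameters. The TensorFlow Hub handle, the binary classification head, and the hyperparameter values are illustrative assumptions, not the course's exact configuration.

    # Model definition and training sketch. The TF Hub handle, classification
    # head, and hyperparameter values are illustrative assumptions.
    import tensorflow as tf
    import tensorflow_hub as hub

    SEQ_LEN = 128
    BERT_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

    # Three standard BERT inputs: token IDs, attention mask, segment IDs.
    input_word_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="input_word_ids")
    input_mask = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="input_mask")
    input_type_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="input_type_ids")

    # Create the BERT layer from TensorFlow Hub; trainable=True fine-tunes it.
    bert_layer = hub.KerasLayer(BERT_URL, trainable=True)
    bert_outputs = bert_layer({
        "input_word_ids": input_word_ids,
        "input_mask": input_mask,
        "input_type_ids": input_type_ids,
    })

    # The pooled [CLS] vector feeds a binary classification head.
    output = tf.keras.layers.Dense(1, activation="sigmoid")(bert_outputs["pooled_output"])
    model = tf.keras.Model([input_word_ids, input_mask, input_type_ids], output)

    # Typical BERT fine-tuning hyperparameters: small learning rate, few epochs.
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    # train_inputs and train_labels are hypothetical pre-encoded data:
    # model.fit(train_inputs, train_labels, batch_size=32, epochs=3)
    # model.save("bert_classifier")  # hypothetical path, reused below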
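
Finally, the data prediction objective: load a trained model, build a sample input, and predict. The saved-model path carries over from the hypothetical training sketch above, and the input order must match how the model was defined there.

    # Data prediction sketch: load a trained model, create sample data, predict.
    import tensorflow as tf
    import tensorflow_hub as hub
    from transformers import BertTokenizer

    model = tf.keras.models.load_model(
        "bert_classifier",  # hypothetical path from the training sketch
        custom_objects={"KerasLayer": hub.KerasLayer},  # rebuilds the hub layer
    )

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    sample = tokenizer(
        "the plot was predictable and dull",
        max_length=128,
        padding="max_length",
        truncation=True,
        return_tensors="tf",
    )

    # Inputs in the same order as the model definition:
    # word IDs, attention mask, segment IDs.
    probability = model.predict(
        [sample["input_ids"], sample["attention_mask"], sample["token_type_ids"]]
    )
    print(float(probability[0][0]))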

IN THIS COURSE

  • 1. Course Overview (2m 34s)
  • 2. Traditional Natural Language Processing (3m 53s)
  • 3. Limitations of Traditional NLP (2m 49s)
  • 4. Attention and Transformers (4m 5s)
  • 5. Natural Language Processing and BERT (4m 21s)
  • 6. BERT and Search Quality (3m 3s)
  • 7. BERT and Context Specificity (3m 15s)
  • 8. BERT and Keywords in SEO (3m 59s)
  • 9. BERT and Masking (2m 56s)
  • 10. Using BERT for Data Augmentation (4m 9s)
  • 11. Using BERT for Text Tokenization (3m 19s)
  • 12. Using BERT for Text Encoding (3m 35s)
  • 13. Using BERT for Model Definition (4m 36s)
  • 14. Using BERT for Model Training (4m 14s)
  • 15. Using BERT for Data Prediction (3m 54s)
  • 16. BERT and TensorFlow (3m 1s)
  • 17. Course Summary (1m 4s)

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course. The badge can be shared on any social network or business platform.

Digital badges are yours to keep, forever.