AI Practitioner: BERT Best Practices & Design Considerations

Artificial Intelligence    |    Expert
  • 17 Videos | 1h 5m 25s
  • Includes Assessment
  • Earns a Badge
Bidirectional Encoder Representations from Transformers (BERT), a natural language processing technique, significantly advances the capabilities of language AI systems. Google's BERT reports state-of-the-art performance on several complex natural language understanding tasks. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques like BERT. You'll define the terms "attention" and "transformer" and explain how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as search engine optimization (SEO) and masking. Next, you'll work with an NLP pipeline utilizing BERT in Python for various tasks, namely text tokenization and encoding, model definition and training, and data augmentation and prediction. Finally, you'll recognize the benefits of using BERT and TensorFlow together.
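To preview the hands-on portion, here is a minimal sketch of BERT text tokenization and encoding in Python. It assumes the Hugging Face transformers package and the bert-base-uncased checkpoint; the course's own demos may use a different BERT toolkit.

```python
# Minimal sketch of BERT text tokenization and encoding.
# Assumption: uses the Hugging Face "transformers" package and the
# "bert-base-uncased" checkpoint; the course may use another toolkit.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
text = "BERT reads a sentence in both directions at once."

# Tokenization: split the text into WordPiece sub-word tokens.
print(tokenizer.tokenize(text))

# Encoding: map tokens to integer IDs, add the [CLS]/[SEP] markers, and
# build the attention mask, padded to a fixed sequence length.
encoded = tokenizer(
    text,
    max_length=16,
    padding="max_length",
    truncation=True,
)
print(encoded["input_ids"])
print(encoded["attention_mask"])
```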

WHAT YOU WILL LEARN

  • discover the key concepts covered in this course
  • recall traditional natural language processing techniques and approaches
  • describe the limitations of traditional natural language processing techniques and list potential breakthroughs
  • define the terms "attention" and "transformer" as they relate to natural language processing
  • specify the role of natural language processing techniques like BERT
  • describe how utilizing BERT techniques helps improve search quality
  • outline how BERT techniques facilitate context specificity
  • list ways of using BERT techniques for search engine optimization
  • describe how masking is used in BERT
  • demonstrate how to do data augmentation using masking and BERT in Python
  • illustrate how to do text tokenization using BERT in Python
  • show how to do text encoding using BERT in Python
  • define a BERT model in Python and create and compile the BERT layer using TensorFlow (see the sketch after this list)
  • train a BERT model in Python and identify the various hyperparameters for BERT
  • demonstrate how to do data prediction using BERT in Python, load a trained BERT model, create the sample data, and predict using the model
  • describe how the use of the TensorFlow package can advance BERT techniques
  • summarize the key concepts covered in this course
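As a companion to the model definition, training, and hyperparameter objectives above, here is a minimal sketch of defining and compiling a BERT classifier in TensorFlow. The TensorFlow Hub handles and hyperparameter values below are assumptions based on commonly published BERT-Base models; the course's demos may pin different versions or follow a different recipe.

```python
# Minimal sketch: define and compile a BERT binary classifier in TensorFlow.
# Assumption: the TensorFlow Hub handles below point to the commonly
# published English BERT-Base encoder and its matching preprocessor; the
# course's own demos may use different models or versions.
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers ops the preprocessor needs

preprocessor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
    trainable=True)  # fine-tune the BERT layer along with the head

# Build the model: raw strings in, class probability out.
text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
encoder_inputs = preprocessor(text_input)
pooled = encoder(encoder_inputs)["pooled_output"]  # [batch, 768] embedding
dropped = tf.keras.layers.Dropout(0.1)(pooled)
output = tf.keras.layers.Dense(1, activation="sigmoid")(dropped)
model = tf.keras.Model(text_input, output)

# Compile with typical BERT fine-tuning hyperparameters: a small learning
# rate (2e-5 to 5e-5), 2-4 epochs, and batch sizes of 16 or 32.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

Training and prediction then follow the usual Keras pattern, for example model.fit(train_texts, train_labels, epochs=3, batch_size=32), then model.predict(tf.constant(["a sample sentence to classify"])).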

IN THIS COURSE

  • 1. Course Overview (2m 31s)
  • 2. Traditional Natural Language Processing (3m 50s)
  • 3. Limitations of Traditional NLP (2m 46s)
  • 4. Attention and Transformers (4m 2s)
  • 5. Natural Language Processing and BERT (4m 18s)
  • 6. BERT and Search Quality (3m)
  • 7. BERT and Context Specificity (3m 12s)
  • 8. BERT and Keywords in SEO (3m 56s)
  • 9. BERT and Masking (2m 53s)
  • 10. Using BERT for Data Augmentation (4m 6s)
  • 11. Using BERT for Text Tokenization (3m 16s)
  • 12. Using BERT for Text Encoding (3m 32s)
  • 13. Using BERT for Model Definition (4m 33s)
  • 14. Using BERT for Model Training (4m 11s)
  • 15. Using BERT for Data Prediction (3m 51s)
  • 16. BERT and TensorFlow (2m 58s)
  • 17. Course Summary (1m 1s)