Neural Network Mathematics: Exploring the Math behind Gradient Descent

Math    |    Expert
  • 12 Videos | 1h 53m 45s
  • Earns a Badge
Because neural networks comprise thousands of neurons and interconnections, training one involves millions of computations. This is where gradient descent, a general-purpose optimization algorithm, comes in. Use this course to gain an intuitive, visual understanding of how gradient descent and the gradient vector work. As you advance, examine three neural network activation functions (ReLU, sigmoid, and hyperbolic tangent) and two variants of the ReLU function (Leaky ReLU and ELU), learning how these variants help deal with deep neural network training issues. Finally, implement a neural network from scratch using TensorFlow and basic Python. When you're done, you'll be able to illustrate the mathematical intuition behind neural networks and be prepared to tackle more complex machine learning problems.
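The gradient descent idea the course builds on is easiest to see on a one-variable function. Below is a minimal, hypothetical sketch (the example function, learning rate, and epoch count are illustrative, not taken from the course):

```python
# Minimal gradient descent sketch: minimize f(w) = (w - 3)^2.
# The derivative is f'(w) = 2 * (w - 3); stepping against it moves w toward 3.

def gradient_descent(lr=0.1, epochs=100):
    w = 0.0                   # arbitrary starting point
    for _ in range(epochs):
        grad = 2 * (w - 3)    # gradient of (w - 3)^2 at the current w
        w -= lr * grad        # step opposite the gradient
    return w

print(gradient_descent())     # approaches the minimizer, 3.0
```

The same update rule, applied to the gradient vector of partial derivatives, is what trains a full neural network.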

WHAT YOU WILL LEARN

  • discover the key concepts covered in this course
  • outline how gradient descent works
  • summarize how to compute the gradient vector of partial derivatives
  • recall the characteristics of activation functions
  • illustrate step, sigmoid, and tangent activation functions
  • illustrate ReLU, Leaky ReLU, and ELU activation functions
  • describe how unstable gradients can be mitigated using variants of the ReLU activation function
  • create a simple neural network with one neuron for regression
  • illustrate the impact of learning rate and number of training epochs
  • illustrate the classification dataset
  • write Python code from scratch to represent and train a single neuron
  • summarize the key concepts covered in this course
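The activation functions listed above have short closed forms. As a hedged sketch (the alpha defaults below are common conventions, not values specified by the course):

```python
import math

def relu(x):
    # Rectified linear unit: passes positives through, zeroes out negatives.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient through for negative inputs.
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # Exponential linear unit: smooth negative branch saturating at -alpha.
    return x if x > 0 else alpha * (math.exp(x) - 1)

def sigmoid(x):
    # Squashes any input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes any input into the range (-1, 1).
    return math.tanh(x)
```

Because Leaky ReLU and ELU keep a nonzero gradient for negative inputs, they help mitigate the "dying ReLU" and unstable-gradient issues the course examines.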

IN THIS COURSE

  1. Course Overview (2m 22s)
  2. The Intuition behind Gradient Descent (11m 21s)
  3. Computing Gradients (12m 24s)
  4. Activation Functions (12m 43s)
  5. Visualizing Common Activation Functions (11m 33s)
  6. Visualizing the ReLU Function and Its Variants (9m 3s)
  7. Mitigating Issues in Neural Network Training (10m 8s)
  8. Simple Regression Using TensorFlow (12m 7s)
  9. Learning Rate and Number of Epochs (8m 10s)
  10. Exploring Datasets and Setting up Utilities (10m 31s)
  11. Training a Simple Neural Network from Scratch (11m 25s)
  12. Course Summary (1m 59s)
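The "from scratch" training covered in the course can be sketched with a single neuron fit to a line by gradient descent on mean squared error. This is a hypothetical illustration in pure Python (the data, learning rate, and epoch count are assumptions, not the course's own example):

```python
# Train a single neuron y_hat = w * x + b for regression, from scratch.

def train_neuron(data, lr=0.05, epochs=500):
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        # Partial derivatives of mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in data) / n
        db = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * dw          # gradient descent step on the weight
        b -= lr * db          # gradient descent step on the bias
    return w, b

# Illustrative data generated from y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(-3, 4)]
w, b = train_neuron(data)     # w approaches 2, b approaches 1
```

Stacking many such neurons, with activation functions between the layers, gives the full networks the course builds with TensorFlow.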

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.