Final Exam: DL Programmer

  • 1 video | 32s
  • Includes Assessment
  • Earns a Badge
Rating 5.0 of 2 users (2)
Final Exam: DL Programmer will test your knowledge and application of the topics presented throughout the DL Programmer track of the Skillsoft Aspire ML Programmer to ML Architect Journey.


  • Distinguish between input, output, and hidden layers in a neural network
    recognize the different types of neural network computational models
    describe resnet layers and blocks
    implement long short-term memory using tensorflow
    compare the supervised and unsupervised learning methods of artificial neural networks
    list neural network algorithms that can be used to solve complex problems across domains
    recall the essential hyperparameters applied to convolutional networks for optimization and model refinement
    recall approaches to identifying overfitting scenarios and preventing overfitting using regularization techniques
    describe gradient descent and list its prominent variants
    recognize the need for an activation layer in convolutional neural networks and compare the prominent activation functions for deep neural networks
    define and classify activation functions and provide a comparative analysis with the pros and cons of the different types of activation functions
    recognize the machine learning problems that can be addressed using hyperparameters, the various hyperparameter tuning methods, and the problems associated with hyperparameter optimization
    define multilayer perceptrons and illustrate the algorithmic difference from single layer perceptrons
    describe functions in calculus
    demonstrate how to test multiple models and select the right model using scikit-learn
    describe the approach of creating deep learning network models along with the steps involved in optimizing the networks
    describe shared parameters and spatial relationships in a convolutional neural network (cnn)
    demonstrate how to build a convolutional neural network for image classification using python
    describe the concept of scaling data and list the prominent data scaling methods
    define semantic segmentation and its implementation using texton forest and random forest-based classifiers
    define the concepts of variance, covariance and random vectors
    list the essential clustering techniques that can be applied to artificial neural networks
    use backpropagation and keras to implement multi-layer perceptron or neural net
    identify and illustrate the use of learning rates to optimize deep learning
    build a recurrent neural network using pytorch and google colab
    describe approaches to addressing the vanishing gradient problem
    describe the regularization techniques used in deep neural network
    develop convolutional neural network models from scratch for object photo classification using python and keras
    build neural networks using pytorch
  • specify approaches that can be used to implement predictions with neural networks
    recognize the need for gradient optimization in neural networks
    implement convolutional neural networks (cnns) using pytorch
    work with hyperparameters using keras and tensorflow to derive optimized convolutional network models
    demonstrate how to select hyperparameters and tune for dense networks using hyperas
    define the concept of the edge detection method and list the common algorithms that are used for edge detection
    recognize the various approaches to improving the performance of machine learning using data, algorithms, algorithm tuning, and ensembles
    identify the different types of learning rules that can be applied in neural networks
    build deep learning language models using keras
    describe the purpose of a training function in an artificial neural network
    calculate loss function and score using python
    recognize the limitations of sigmoid and tanh and describe how they can be resolved using relu along with the significant benefits afforded by relu when applied in convolutional networks
    implement backpropagation using python to train artificial neural networks
    recognize the differences between the non-linear activation functions
    recognize the role of the pooling layer in convolutional networks along with the various operations and functions that can be applied to it
    describe the temporal and heterogeneous approaches of optimizing predictions
    implement recurrent neural network using python and tensorflow
    list activation mechanisms used in the implementation of neural networks
    describe sequence modeling as it pertains to language models
    implement calculus, derivatives, and integrals using python
    implement the artificial neural network training process using python
    list features and characteristics of gated recurrent units (grus)
    demonstrate the implementation of differentiation and integration in r
    work with threshold functions in neural networks
    describe the iterative workflow for machine learning problems with focus on essential measures and evaluation protocols
    recognize the importance of linear algebra in machine learning
    define and illustrate the use of learning rates to optimize deep learning
    recall the prominent optimizer algorithms along with their properties that can be applied for optimization
    recognize the role of mathematics in convolutional neural networks and recall the essential rules applied to filters and channel detection
    recall the algorithms that can be used to train neural networks
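
Several of the objectives above ask for from-scratch Python implementations of backpropagation, gradient descent, and activation functions. As an illustration only (this is not the exam's code, and the exam may instead use Keras or PyTorch as the objectives note), a minimal NumPy sketch of training a two-layer perceptron on XOR with sigmoid activations and gradient descent might look like this:

```python
import numpy as np

# Illustrative from-scratch backpropagation for a tiny MLP on the XOR problem.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights: input -> hidden (2x4), hidden -> output (4x1)
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: gradients of the mean squared error
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent parameter updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The learning rate, hidden-layer width, and iteration count here are arbitrary choices for the sketch; the exam objectives on hyperparameter tuning cover how such values are selected systematically.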


Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.

