# Linear Models & Gradient Descent: Managing Linear Models

Intermediate
• 11 Videos | 47m 46s
• Includes Assessment
Explore machine learning linear models, their classifications, and the prominent statistical approaches used to implement them. This 11-video course also covers bias, variance, and regularization. Key concepts include: linear models and the classifications used in predictive analytics; the statistical approaches used to implement linear models, namely single regression, multiple regression, and analysis of variance (ANOVA); and the essential components of a generalized linear model (random component, linear predictor, and link function). Next, discover the differences between the ANOVA and analysis of covariance (ANCOVA) approaches to statistical testing; implement linear regression models using Scikit-learn; and examine bias, variance, and regularization and their use in evaluating predictive models. Learners also explore ensemble techniques, seeing how bagging and boosting algorithms are used to manage predictions, and implement bagging with the random forest approach using Scikit-learn. Finally, observe how to implement boosting ensemble algorithms using the AdaBoost classifier in Python.
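As a taste of the linear regression implementation the course demonstrates, here is a minimal Scikit-learn sketch. The synthetic one-feature dataset and its true coefficients (slope 3, intercept 2) are illustrative assumptions, not course material:

```python
# Minimal linear regression with scikit-learn on synthetic data.
# The dataset and its parameters are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))                 # one feature
y = 3.0 * X.ravel() + 2.0 + rng.normal(0, 0.5, 100)   # y ≈ 3x + 2 plus noise

model = LinearRegression().fit(X, y)
# The fitted slope and intercept should land near 3 and 2.
print(model.coef_[0], model.intercept_)
```

With only 100 noisy points the recovered coefficients will not be exact, which is itself a useful illustration of estimation variance.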

## WHAT YOU WILL LEARN

• discover the key concepts covered in this course
• define linear models and the various classifications of linear models used in predictive analytics
• recognize the different statistical approaches used to implement linear models (single regression, multiple regression, and ANOVA)
• define the generalized linear model and its essential components (random component, linear predictor, and link function)
• compare the differences between the ANOVA and ANCOVA approaches to statistical testing
• demonstrate the implementation of linear regression models using Scikit-learn
• describe the concepts of bias, variance, and regularization and their use in evaluating predictive models
• define ensemble techniques and illustrate how bagging and boosting algorithms are used to manage predictions
• implement bagging algorithms with the random forest approach using Scikit-learn
• implement boosting ensemble algorithms using the AdaBoost classifier in Python
• list the classifications of linear models, recall the essential components of generalized linear models, and implement a boosting algorithm using the AdaBoost classifier
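The bagging objective above can be sketched with Scikit-learn's random forest. The built-in iris dataset and the split/seed parameters are assumptions standing in for whatever data the course uses:

```python
# Bagging via random forest: each tree trains on a bootstrap sample of the
# training set, and class predictions are combined by majority vote.
# Dataset (iris) and parameters are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)
print(f"test accuracy: {forest.score(X_test, y_test):.2f}")
```

Because each tree sees a different bootstrap sample and a random subset of features at each split, the averaged ensemble typically has lower variance than any single tree.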

## IN THIS COURSE

1. Course Overview
2. Linear Model and its Classification
3. Linear Modeling Approach
4. Generalized Linear Model
5. ANOVA and ANCOVA
6. Linear Model Implementation
7. Bias, Variance and Regularization
8. Ensemble Techniques
9. Bagging Implementation
10. Implementing Boosting Algorithm
11. Exercise: Linear Models and Ensemble
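The boosting module works with the AdaBoost classifier; a minimal hedged sketch follows. The breast cancer dataset and the split/seed parameters are assumptions, not taken from the course:

```python
# Boosting with AdaBoost: shallow trees are fit sequentially, with
# misclassified samples reweighted so later trees focus on them.
# Dataset (breast cancer) and parameters are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# The default base learner is a depth-1 decision tree ("stump").
ada = AdaBoostClassifier(n_estimators=50, random_state=42)
ada.fit(X_train, y_train)
print(f"test accuracy: {ada.score(X_test, y_test):.2f}")
```

Where bagging reduces variance by averaging independent trees, boosting reduces bias by adding weak learners that correct the ensemble's remaining errors.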

## EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of this course, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.