# The Math Behind Decision Trees: An Exploration of Decision Trees

Math | Intermediate
• 18 Videos | 2h 7m 58s
• Includes Assessment
Decision trees are an effective supervised learning technique for predicting the class or value of a target variable. Like many other supervised learning methods, they're well-suited to both classification and regression tasks, and their branching structure makes their predictions easy to interpret. Use this course to learn how to work with decision trees and classification, distinguishing between rule-based and ML-based approaches. As you progress through the course, investigate how to work with entropy, Gini impurity, and information gain. Practice implementing both rule-based and ML-based decision trees and leveraging powerful Python visualization libraries to construct intuitive graphical representations of decision trees. Upon completion, you'll be able to create, use, and share rule-based and ML-based decision trees.
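To make the rule-based/ML-based distinction concrete: a rule-based decision tree is simply a set of hand-authored if/else splits, while an ML-based tree learns its splits from data. A minimal sketch of the rule-based side (the loan features and thresholds here are invented for illustration, not taken from the course):

```python
# A hypothetical rule-based decision tree: every split is a rule a human
# wrote down, rather than a threshold learned from training data.
def classify_loan(income, credit_score):
    """Approve, review, or reject a loan using fixed, hand-authored rules."""
    if credit_score >= 700:          # root split: strong credit -> approve
        return "approve"
    if income >= 50_000:             # second split: income compensates
        return "approve" if credit_score >= 600 else "review"
    return "reject"                  # leaf: low income and weak credit

print(classify_loan(income=80_000, credit_score=650))  # -> approve
print(classify_loan(income=30_000, credit_score=500))  # -> reject
```

An ML-based tree has the same nested-if structure, but an algorithm such as ID3 or CART chooses the features and thresholds automatically.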

## WHAT YOU WILL LEARN

• discover the key concepts covered in this course
• define what's meant by classification, describing classification rules and rule-based classifier properties and limitations
• contrast rule-based and ML-based classifiers
• outline the structure of a decision tree, the process it uses to "decide," its advantages, and some core considerations when building one
• work through the creation of a decision tree and list some decision tree algorithms
• define what's meant by entropy and outline how it's used in relation to decision trees, referencing the ID3 algorithm and information gain
• summarize how information gain and entropy are used in tandem
• define Gini impurity and calculate it for a dataset
• split decision trees based on Gini impurity
• import modules and set up data
• decide splits for a rule-based decision tree
• define a rule-based decision tree
• illustrate the use of decision trees for continuous values
• visualize a decision tree
• create a rule-based decision tree
• train an ML-based decision tree
• use a trained ML-based decision tree to make decisions
• summarize the key concepts covered in this course
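The three measures named above — entropy, Gini impurity, and information gain — can be sketched in plain Python. The 9-positive/5-negative label counts below are the textbook example commonly used with ID3, not data from the course:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: the chance of mislabeling a randomly drawn sample."""
    total = len(labels)
    return 1 - sum((c / total) ** 2 for c in Counter(labels).values())

def information_gain(parent, splits):
    """Entropy reduction from splitting `parent` into `splits` (ID3 criterion)."""
    total = len(parent)
    weighted = sum(len(s) / total * entropy(s) for s in splits)
    return entropy(parent) - weighted

labels = ["yes"] * 9 + ["no"] * 5            # 9 positives, 5 negatives
print(round(entropy(labels), 3))             # -> 0.94
print(round(gini(labels), 3))                # -> 0.459

# A candidate split of the same 14 samples into two child nodes:
split = [["yes"] * 6 + ["no"] * 2, ["yes"] * 3 + ["no"] * 3]
print(round(information_gain(labels, split), 3))  # -> 0.048
```

ID3 uses information gain (entropy-based) to pick splits, while CART-style trees typically minimize the weighted Gini impurity of the children; both pick the feature whose split leaves the purest child nodes.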

## IN THIS COURSE

1. Course Overview
2. How Classification Is Used
3. Comparing Rule-based and ML-based Models
4. How Decision Trees Work
5. Building a Rule-based Decision Tree
6. How Entropy Works
7. How Entropy and Information Gain Work Together
8. How Gini Impurity Works
9. Deciding Splits Based on Gini Impurity
10. Setting up Datasets
11. Imagine a Rule-based Decision Tree
12. Creating a Basic Decision Tree
13. Working with Decision Trees and Continuous Data
14. Plotting a Decision Tree in a Tree Diagram
15. Defining the Rules for a Rule-based Decision Tree
16. Training an ML-based Decision Tree
17. Testing an ML-based Decision Tree
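Topics 16 and 17 correspond to training and then testing an ML-based decision tree with a library such as scikit-learn. A minimal sketch, assuming scikit-learn is installed and using its bundled iris dataset in place of the course's own data:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Load a small labeled dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# By default the classifier chooses splits that minimize Gini impurity;
# criterion="entropy" would use information gain instead (as in ID3).
clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
clf.fit(X_train, y_train)                      # topic 16: training

# Topic 17: use the trained tree to make decisions on unseen data.
print("test accuracy:", clf.score(X_test, y_test))

# Text rendering of the learned splits; sklearn.tree.plot_tree gives the
# graphical tree-diagram version covered in topic 14.
print(export_text(clf, feature_names=load_iris().feature_names))
```

Each printed rule in the text output is a learned equivalent of the hand-written if/else splits in a rule-based tree.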