
Summary

In this session, you looked at one of the major drawbacks of decision trees, namely overfitting, and the various methods that can be used to avoid it.

You learnt that decision trees are prone to overfitting and that there are two ways to avoid it: truncation and pruning.

In truncation, you let the tree grow only to a certain size, while in pruning, you let the tree grow to its logical end and then chop off the branches that do not improve accuracy on the validation set.
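For concreteness, here is a minimal sketch of pruning, assuming scikit-learn. The closest built-in mechanism is cost-complexity pruning via the ccp_alpha parameter; in the spirit of the description above, the pruning level is chosen by accuracy on a held-out validation set. The dataset used here is purely illustrative.

```python
# Sketch: post-pruning a decision tree with cost-complexity pruning,
# selecting the pruning level by validation-set accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=42)

# Grow the tree to its logical end, then compute candidate pruning levels.
path = DecisionTreeClassifier(random_state=42).cost_complexity_pruning_path(
    X_train, y_train
)

# Keep the pruned tree that scores best on the validation set.
best_alpha, best_score = 0.0, 0.0
for alpha in path.ccp_alphas:
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=42)
    tree.fit(X_train, y_train)
    score = tree.score(X_val, y_val)
    if score > best_score:
        best_alpha, best_score = alpha, score

print(f"Best ccp_alpha: {best_alpha:.5f}, validation accuracy: {best_score:.3f}")
```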

There are various hyperparameters in the DecisionTreeClassifier that let you truncate the tree, such as min_samples_split, max_depth, etc.
You also learnt about the effect of these hyperparameters on decision tree construction, as shown in the sketch below.
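As a quick illustration of truncation, the sketch below tunes two of these hyperparameters with scikit-learn's GridSearchCV. The hyperparameter grid is an illustrative assumption, not a recommendation, and the dataset is again only a placeholder.

```python
# Sketch: truncating a decision tree by limiting its growth, with the
# stopping hyperparameters chosen by cross-validated grid search.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Truncation hyperparameters: stop splitting once the tree is too deep
# (max_depth) or a node holds too few samples to split (min_samples_split).
param_grid = {
    "max_depth": [2, 4, 6, 8],
    "min_samples_split": [2, 10, 50],
}
search = GridSearchCV(DecisionTreeClassifier(random_state=42), param_grid, cv=5)
search.fit(X, y)
print("Best truncation settings:", search.best_params_)
```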
