Decision trees are intuitive and versatile algorithms that can handle both continuous and categorical attributes. Does this mean they should be used in every case? Not really. Let's watch the upcoming video to understand the problems associated with decision trees.
Note that you will learn about model selection and the related concepts of overfitting and the bias-variance trade-off in the upcoming modules.
The following is a summary of the disadvantages of decision trees:
- They tend to overfit the data. If allowed to grow with no check on its complexity, a decision tree will keep splitting until it has correctly classified (or rather, memorised) every data point in the training set, including the noise.
- They tend to be quite unstable, which is a consequence of overfitting. A few changes in the data can considerably change the structure of a tree.
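Both disadvantages can be seen in a short experiment. The sketch below (assuming scikit-learn is available; the dataset and parameter choices are illustrative, not from the course) fits an unconstrained `DecisionTreeClassifier` on noisy data: it scores perfectly on the training set but worse on held-out data, and flipping just a handful of training labels can reshape the tree.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with 20% label noise -- points an honest model should NOT fit
X, y = make_classification(n_samples=400, n_features=5, flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# No max_depth / min_samples_leaf: the tree splits until every training point is classified
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print("train accuracy:", full.score(X_tr, y_tr))  # 1.0 -- the noise has been memorised
print("test accuracy: ", full.score(X_te, y_te))  # noticeably lower

# Instability: flip just 5 of the 300 training labels and refit
y_perturbed = y_tr.copy()
y_perturbed[:5] = 1 - y_perturbed[:5]
other = DecisionTreeClassifier(random_state=0).fit(X_tr, y_perturbed)
print("depths:", full.get_depth(), "vs", other.get_depth())  # structure can change markedly
```

Constraining the tree (e.g. via `max_depth` or `min_samples_leaf`) trades some training accuracy for better generalisation, which is exactly the trade-off the upcoming modules discuss.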