January 8, 2021
Decision trees are notorious for overfitting. Pruning is a regularization method that penalizes the size of the tree, i.e. it adds a complexity term to the cost function.
Pruning is of two types:
- Post Pruning (Backward Pruning): The full tree is grown first, and the non-significant branches are then pruned/removed. At each step, cross validation checks whether keeping a branch improves accuracy. If it does not, the branch is collapsed into a leaf node.
- Pre Pruning (Forward Pruning): This approach prevents non-significant branches from being generated in the first place. Tree growth is stopped early whenever a given stopping condition is met, such as a maximum depth or a minimum number of samples per split. Both approaches are sketched in code after this list.
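Here is a minimal sketch of both pruning styles using scikit-learn's DecisionTreeClassifier. It is illustrative only: the dataset, the hyperparameter values, and the choice of cost-complexity pruning (ccp_alpha) as the post-pruning mechanism are my assumptions, not part of the original discussion.

```python
# Sketch: pre-pruning vs. post-pruning with scikit-learn (illustrative values).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Pre-pruning: stopping conditions (max_depth, min_samples_leaf) keep
# non-significant branches from ever being generated.
pre_pruned = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
pre_pruned.fit(X_train, y_train)

# Post-pruning: grow the full tree, then prune it back with
# cost-complexity pruning. Each candidate ccp_alpha penalizes tree size;
# cross validation picks the alpha whose pruned tree generalizes best.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_train, y_train)
cv_scores = [
    cross_val_score(
        DecisionTreeClassifier(ccp_alpha=alpha, random_state=0),
        X_train, y_train, cv=5,
    ).mean()
    for alpha in path.ccp_alphas
]
best_alpha = path.ccp_alphas[int(np.argmax(cv_scores))]
post_pruned = DecisionTreeClassifier(ccp_alpha=best_alpha, random_state=0).fit(X_train, y_train)

print("pre-pruned test accuracy :", pre_pruned.score(X_test, y_test))
print("post-pruned test accuracy:", post_pruned.score(X_test, y_test))
```

Larger ccp_alpha values prune more aggressively, so the cross-validation loop above is just a concrete way of doing the "check accuracy, otherwise convert to a leaf" step described for post pruning.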
by: Monis Khan