Maximum depth of the tree: if the tree grows too deep, it overfits the training data, which is the usual motivation for pruning.
Error-based tree pruning with subtree raising relies on a certainty factor to inflate the error rate estimated at a leaf. If the same certainty factor is used and the automatic error addition is turned off, then the only difference between the two decision-tree pruning methods is subtree raising. USFC also allows subtree raising to be turned off. Machine-learning folklore holds that decision-tree pruning methods generally do not prune hard enough.
In particular, error-based pruning, a simple method that does not require a validation set, has been criticized on this count. For example, Esposito et al. performed an empirical study of decision-tree pruning methods and reported this tendency to underprune. The system adds to the error-based pruning method the grafting[1], or subtree replacement, operator.
Grafting a decision node consists of replacing the subtree rooted at the father of the node by the subtree rooted at the node itself. [1] The term "grafting" was originally introduced by Esposito et al. The error estimate is computed in the same way in each case: by taking the upper limit of a confidence interval computed from the error on the training data at each leaf node.
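A minimal sketch of that per-leaf estimate, under the usual C4.5-style normal approximation (the function name and the default certainty factor of 0.25 are illustrative assumptions, not taken from the text above):

```python
from statistics import NormalDist

def pessimistic_error(errors: int, n: int, cf: float = 0.25) -> float:
    """Upper limit of a one-sided confidence interval on the training
    error rate at a leaf (C4.5-style normal approximation).

    errors: misclassified training instances at the leaf
    n:      total training instances at the leaf
    cf:     certainty factor; a smaller cf gives a more pessimistic
            estimate and therefore more aggressive pruning
    """
    f = errors / n                         # observed error rate
    z = NormalDist().inv_cdf(1.0 - cf)     # one-sided z-score for cf
    return (f + z * z / (2 * n)
            + z * ((f * (1 - f) / n + z * z / (4 * n * n)) ** 0.5)) \
           / (1 + z * z / n)
```

Note how the estimate behaves: it always exceeds the raw training error rate, shrinks toward it as the leaf sees more instances, and grows as the certainty factor is lowered.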
Note that, if subtree raising occurs, the whole procedure, including all three options, is performed again on the resulting pruned tree. As we saw in this question, the recommended strategy for building a decision tree is post-pruning. The two operations for that are subtree replacement and subtree raising. For each node, the algorithm decides whether it should perform subtree replacement, subtree raising, or leave the subtree as is.
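The per-node choice can be sketched as follows. This is an illustrative simplification, not the text's exact algorithm: the class and function names are my own, and "raising" is approximated by promoting the largest branch without redistributing the other branches' instances, which full subtree raising would do.

```python
from dataclasses import dataclass, field
from statistics import NormalDist

Z = NormalDist().inv_cdf(0.75)  # one-sided z for a certainty factor of 0.25

def est_errors(e: int, n: int) -> float:
    """Predicted errors at a leaf: n times the upper confidence
    limit on the observed training error rate."""
    if n == 0:
        return 0.0
    f = e / n
    upper = (f + Z * Z / (2 * n)
             + Z * ((f * (1 - f) / n + Z * Z / (4 * n * n)) ** 0.5)) \
            / (1 + Z * Z / n)
    return n * upper

@dataclass
class Node:
    counts: list                 # per-class training counts reaching this node
    children: list = field(default_factory=list)

    @property
    def n(self):
        return sum(self.counts)

    @property
    def errors(self):            # training errors if this node were a leaf
        return self.n - max(self.counts)

def subtree_errors(node):
    """Estimated errors of the subtree as it currently stands."""
    if not node.children:
        return est_errors(node.errors, node.n)
    return sum(subtree_errors(c) for c in node.children)

def prune(node):
    """Bottom-up post-pruning: at each internal node, compare keeping
    the subtree, replacing it by a leaf (subtree replacement), and a
    simplified subtree raising that promotes the largest branch."""
    if not node.children:
        return node
    node.children = [prune(c) for c in node.children]
    keep = subtree_errors(node)
    as_leaf = est_errors(node.errors, node.n)
    largest = max(node.children, key=lambda c: c.n)
    raised = est_errors(node.n - max(largest.counts), node.n)
    best = min(keep, as_leaf, raised)
    if best == as_leaf:
        return Node(node.counts)                    # subtree replacement
    if best == raised:                              # re-prune after raising
        return prune(Node(node.counts, list(largest.children)))
    return node
```

The recursive call after raising mirrors the text's point that the whole procedure is repeated on the resulting pruned tree.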