
* ''' Case 1 ''' T contains cases all belonging to a single class Cj. The decision tree for T is a leaf identifying class Cj.

* ''' Case 2 ''' T contains cases that belong to a mixture of classes. A test is chosen, based on a single attribute, that has one or more mutually exclusive outcomes {O1, O2, O3, ..., On}. Note that in many implementations n is chosen to be 2, which leads to a binary decision tree. T is partitioned into subsets T1, T2, ..., Tn, where Ti contains all the cases in T that have outcome Oi of the chosen test. The decision tree for T consists of a decision node identifying the test, and one branch for each possible outcome. The same tree-building machinery is applied recursively to each subset of training cases.

* ''' Case 3 ''' T contains no cases. The decision tree for T is a leaf, but the class to be associated with the leaf must be determined from information other than T. For example, C4.5 chooses this to be the most frequent class at the parent of this node.
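The three cases above can be sketched as a single recursive procedure. This is a minimal illustration, not C4.5 itself: the test-selection step is a placeholder (C4.5 chooses the attribute by gain ratio, which is omitted here), and the data representation (cases as attribute-dict/class pairs) is an assumption for the sketch.

```python
from collections import Counter

def build_tree(cases, attributes, parent_majority=None):
    """Sketch of the recursive tree-building scheme.

    cases      -- list of (attribute_dict, class_label) pairs (set T)
    attributes -- attribute names still available for testing
    Note: attribute selection is a placeholder; C4.5 uses gain ratio.
    """
    # Case 3: T contains no cases -> leaf labelled with the most
    # frequent class at the parent node.
    if not cases:
        return ("leaf", parent_majority)

    classes = [cls for _, cls in cases]
    majority = Counter(classes).most_common(1)[0][0]

    # Case 1: all cases belong to a single class Cj -> leaf for Cj.
    if len(set(classes)) == 1:
        return ("leaf", classes[0])
    # No attributes left to test: fall back to the majority class.
    if not attributes:
        return ("leaf", majority)

    # Case 2: mixture of classes -> choose a test, partition T into
    # T1..Tn by outcome, and recurse on each subset.
    attr = attributes[0]          # placeholder for gain-ratio selection
    rest = attributes[1:]
    branches = {}
    for outcome in sorted({a[attr] for a, _ in cases}):
        subset = [(a, c) for a, c in cases if a[attr] == outcome]
        branches[outcome] = build_tree(subset, rest, majority)
    return ("node", attr, branches)
```

A mixture such as `[({"outlook": "sunny"}, "no"), ({"outlook": "rain"}, "yes")]` triggers Case 2 and produces a decision node with one branch per outcome, each ending in a Case 1 leaf; an empty subset would fall through to Case 3 and inherit the parent's majority class.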