In this approach, all processors construct a decision tree synchronously by sending and receiving class distribution information of the local data. Major steps of the approach (see the sketch after this list):
- select a node to expand according to a decision tree expansion strategy (e.g., Depth-First or Breadth-First), and call that node the current node. Initially, the root node is selected as the current node
- for each data attribute, collect class distribution information of the local data at the current node
- exchange the local class distribution information using a global reduction among processors
- simultaneously compute the entropy gain of each attribute at every processor, and select the best attribute for child node expansion
- depending on the desired branching factor of the tree, create child nodes for the same number of partitions of attribute values, and split the training cases accordingly
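
The following is a minimal sketch of one synchronous expansion step (the second through fourth steps above), assuming categorical attributes encoded as small integers, NumPy arrays for local data, and mpi4py for the global reduction; all function and variable names are illustrative, not from the source.

```python
# Sketch of one synchronous expansion step, assuming categorical attributes
# encoded as integers in [0, n_values) and mpi4py for the global reduction.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD

def entropy(counts):
    """Entropy of a class-count vector; an empty node contributes 0."""
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts[counts > 0] / total
    return float(-np.sum(p * np.log2(p)))

def select_best_attribute(local_X, local_y, n_values, n_classes):
    """Every processor calls this for the current node; since the reduced
    distribution is identical everywhere, all processors return the same
    attribute without further coordination.

    local_X : (n_local, n_attrs) int array of attribute value codes
    local_y : (n_local,)         int array of class labels
    """
    n_attrs = local_X.shape[1]

    # Collect the class distribution of the *local* data, one
    # (n_values x n_classes) table per attribute.
    local_dist = np.zeros((n_attrs, n_values, n_classes))
    for a in range(n_attrs):
        for v, c in zip(local_X[:, a], local_y):
            local_dist[a, v, c] += 1.0

    # Global reduction: element-wise sum of the tables across processors.
    global_dist = np.empty_like(local_dist)
    comm.Allreduce(local_dist, global_dist, op=MPI.SUM)

    # Gain computation, performed identically on every processor.
    # Any attribute's table sums to the node's overall class distribution.
    class_counts = global_dist[0].sum(axis=0)
    parent_entropy = entropy(class_counts)
    n_total = class_counts.sum()
    gains = np.empty(n_attrs)
    for a in range(n_attrs):
        per_value = global_dist[a]                  # (n_values, n_classes)
        weights = per_value.sum(axis=1) / n_total
        gains[a] = parent_entropy - sum(
            w * entropy(c) for w, c in zip(weights, per_value))
    return int(np.argmax(gains))
```

The final step is then purely local: each processor partitions its own training cases by the selected attribute's values and attaches them to the corresponding child nodes, so no training data moves between processors.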