Changes between Version 36 and Version 37 of Parallel-DT


Timestamp: Jan 18, 2010, 9:24:23 PM
Author: andrei.minca

  • Parallel-DT

    v36 v37  
    28 28
    29 29  In this approach, all processors construct a decision tree synchronously by sending and receiving class distribution information of the local data. Major steps of the approach (see the sketch below):
    30     * * select a node to expand according to a decision tree expansion strategy (e.g. Depth-First or Breadth-First), and call that node the current node. At the beginning, the root node is selected as the current node
    31     * * for each data attribute, collect class distribution information of the local data at the current node
       30  # select a node to expand according to a decision tree expansion strategy (e.g. Depth-First or Breadth-First), and call that node the current node. At the beginning, the root node is selected as the current node
       31  # for each data attribute, collect class distribution information of the local data at the current node
    32 32  * * exchange the local class distribution information using a global reduction among processors
    33 33  * * simultaneously compute the entropy gain of each attribute at each processor and select the best attribute for child node expansion
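
The steps above amount to: build per-attribute class-distribution tables from the local records, combine them with a global (all-reduce) sum, and let every processor evaluate the same entropy gains. Below is a minimal sketch of one node-expansion step, assuming mpi4py and NumPy; the names (best_attribute, local_X, local_y, n_values, n_classes) are illustrative and not taken from the Parallel-DT code.

{{{#!python
# Sketch of one synchronous node expansion: every processor holds a slice of the
# records for the current node (local_X: integer-coded attribute values,
# local_y: integer class labels) and all processors reach the same split decision.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD

def entropy(counts):
    # Entropy of a class-count vector; 0 for an empty partition.
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts[counts > 0] / total
    return -np.sum(p * np.log2(p))

def best_attribute(local_X, local_y, attributes, n_values, n_classes):
    # Global class distribution at the current node (local counts + allreduce).
    parent = np.bincount(local_y, minlength=n_classes).astype(float)
    comm.Allreduce(MPI.IN_PLACE, parent, op=MPI.SUM)
    parent_h = entropy(parent)
    n_total = parent.sum()

    best_attr, best_gain = None, -1.0
    for a in attributes:
        # Local class-distribution table: rows = attribute values, cols = classes.
        table = np.zeros((n_values[a], n_classes))
        for value, cls in zip(local_X[:, a], local_y):
            table[value, cls] += 1.0
        # Exchange local tables: a global reduction yields the full distribution.
        comm.Allreduce(MPI.IN_PLACE, table, op=MPI.SUM)
        # Every processor now computes the same weighted child entropy and gain.
        split_h = sum(entropy(row) * row.sum() / n_total for row in table)
        gain = parent_h - split_h
        if gain > best_gain:
            best_attr, best_gain = a, gain
    return best_attr, best_gain
}}}

Because each processor sees the same globally reduced counts, they all select the same split attribute without ever exchanging the training records themselves; each processor then partitions its local records among the child nodes and the loop continues with the next node chosen by the expansion strategy.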