Changes between Version 44 and Version 45 of Parallel-DT


Timestamp: Jan 18, 2010, 10:30:41 PM
Author: andrei.minca

  • Parallel-DT

    v44 v45  
    30 30  In this approach, all processors construct a decision tree synchronously by sending and receiving class distribution information of local data. Major steps of the approach:
    31 31
    32        * select a node to expand according to a decision tree expansion strategy (eg Depth-First or Breadth-First), and call that node the current node. At the beginning, the root node is selected as the current node
       32    * select a node to expand according to a decision tree expansion strategy (e.g. Depth-First or Breadth-First), and call that node the current node. At the beginning, the root node is selected as the current node
    33 33    * for each data attribute, collect class distribution information of the local data at the current node
    34 34    * exchange the local class distribution information using a global reduction among processors
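The collect-and-reduce steps above can be sketched in plain Python. This is a minimal simulation, not the page's implementation: each "processor" holds a local slice of the training data, and the global reduction over the processor group (an MPI-style all-reduce in a real system) is simulated as an element-wise sum of per-processor counters. All function and variable names here are illustrative.

```python
from collections import Counter

def local_distribution(records, attr):
    """Count (attribute value, class) pairs in this processor's local data."""
    dist = Counter()
    for row in records:
        dist[(row[attr], row["class"])] += 1
    return dist

def global_reduction(local_dists):
    """Element-wise sum of all local distributions (stands in for an all-reduce)."""
    total = Counter()
    for d in local_dists:
        total += d
    return total

# Two simulated processors, each holding a slice of the training data,
# collecting the distribution for a hypothetical attribute "outlook".
p0 = [{"outlook": "sunny", "class": "no"}, {"outlook": "rain", "class": "yes"}]
p1 = [{"outlook": "sunny", "class": "no"}, {"outlook": "sunny", "class": "yes"}]

dists = [local_distribution(p, "outlook") for p in (p0, p1)]
total = global_reduction(dists)
```

After the reduction every processor would hold the same global counts for the current node, which is what lets them all make the same split decision.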
     
    47 47
    48 48 '''Step 1''' processors in ''Pn'' cooperate to expand node ''n''
       49
    49 50 '''Step 2''' once node ''n'' is expanded into successor nodes ''n1'', ''n2'', ..., ''nk'', the processor group ''Pn'' is also partitioned, and the successor nodes are assigned to processors as follows:
       51
    50 52    '''Case 1''': if the number of successor nodes is greater than |''Pn''|
    51 53                1. partition the successors into |''Pn''| groups such that the total number of training cases corresponding to each node group is roughly equal. Assign each processor to one node group.
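Case 1 amounts to a load-balancing problem: split the k successor nodes across |''Pn''| groups so that each group covers roughly the same number of training cases. The page does not specify a partitioning algorithm, so the sketch below uses a common greedy largest-first heuristic as one possible way to get a "roughly equal" split; all names are illustrative.

```python
import heapq

def partition_successors(case_counts, num_groups):
    """Assign successor-node indices to num_groups groups, balancing totals.

    case_counts[i] is the number of training cases at successor node i.
    Greedy heuristic: place the largest nodes first, each into the group
    with the smallest running total.
    """
    heap = [(0, g) for g in range(num_groups)]  # (total cases, group index)
    heapq.heapify(heap)
    groups = [[] for _ in range(num_groups)]
    for node, count in sorted(enumerate(case_counts), key=lambda x: -x[1]):
        total, g = heapq.heappop(heap)
        groups[g].append(node)
        heapq.heappush(heap, (total + count, g))
    return groups

# Five successor nodes split across a processor group of size 2.
groups = partition_successors([50, 30, 20, 10, 40], num_groups=2)
```

Each resulting node group is then handled by one subset of the processors in ''Pn'', so a balanced split keeps the per-processor work roughly even.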