| 20 | Hadoop's MapReduce framework exposes a very large number of configurable parameters for running a job. One can set the number of map tasks, which is usually driven by the total size of the inputs; the right level of parallelism seems to be around 10-100 maps per node. Because task setup takes a while, each map should run for at least a minute to justify the overhead. Hadoop sets this parameter dynamically, based on how the input is split. One can also set the number of reduce tasks, which deserves closer attention. |
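As a minimal sketch, the reduce count can be set per job through the standard `mapreduce.job.reduces` property (or programmatically via `Job.setNumReduceTasks()`); the value `8` below is an arbitrary illustration, not a recommendation:

```xml
<!-- Configuration fragment (e.g. in mapred-site.xml or a per-job config):
     fixes the number of reduce tasks for the job. The map count, by
     contrast, is derived from the input splits and is not set this way. -->
<property>
  <name>mapreduce.job.reduces</name>
  <value>8</value>
</property>
```

A common rule of thumb from the Hadoop documentation is to size this close to a small multiple of the cluster's reduce-slot (container) capacity, so all reduces finish in one or two waves.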