Hi,

I would like to have several map tasks that all execute the same work.
For example, I have 3 map tasks (M1, M2, and M3) and 1 GB of input data
that each map should read. Each map should read the same input data and
send its result to the same reduce. At the end, the reduce should
produce the same 3 results.

If I put 3 instances of the same machine in the conf/slaves file, like this:

<file>
localhost
localhost
localhost
</file>

does that solve the problem?


Also, how do I define the number of map tasks to run?
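
In case it helps, here is what I have found so far. I believe the classic `mapred.map.tasks` property is only a hint to the framework, and that the actual number of maps is driven by the input splits, but please correct me if I am wrong:

```xml
<!-- Sketch, in mapred-site.xml (or set on the JobConf):
     a hint for how many map tasks the job should run.
     As I understand it, this is only a hint; the real number
     of maps is determined by the number of input splits. -->
<property>
  <name>mapred.map.tasks</name>
  <value>3</value>
</property>
```

Is this the right knob, or is there another way to force each map to see the whole input?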



Best regards,
-- 
xeon
