Hello Ashish,

Passing “-D mapreduce.job.reduces=<number>” with a fixed number of reducers
will make the job spawn that many reduce tasks.
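For example, something like the following should work, assuming your driver
class uses ToolRunner/GenericOptionsParser so that -D options are parsed
(the jar name, class name, and paths below are just placeholders):

hadoop jar myjob.jar MyJobClass -D mapreduce.job.reduces=4 /input /output

Alternatively you can set it in the driver code itself with
job.setNumReduceTasks(4). On older Hadoop versions the equivalent property
is "mapred.reduce.tasks".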


On Thu, Jan 16, 2014 at 12:45 PM, Ashish Jain <[email protected]> wrote:

> Dear All,
>
> I have a 3 node cluster and have a map reduce job running on it. I have 8
> data blocks spread across all the 3 nodes. While running map reduce job I
> could see 8 map tasks running however reduce job is only 1. Is there a way
> to configure multiple reduce jobs?
>
> --Ashish
>



-- 

Regards,
...Sudhakara.st
