Hi Ulul,

Thank you for the explanation. I have googled the feature, and Hortonworks
says:

This feature is a technical preview and considered under development. Do
not use this feature in your production systems.

Can we use it in a production environment?


2015-02-15 20:15 GMT+08:00 Ulul <had...@ulul.org>:

>  Hi
>
> Actually it depends: in MR1 each mapper or reducer will be executed in
> its own JVM; in MR2 you can activate uber jobs, which let the framework
> run a small job's mappers and reducers serially inside the ApplicationMaster
> JVM.
>
> Look for the mapreduce.job.ubertask.* properties.
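>
> To make that concrete, here is a minimal sketch of enabling uber mode
> through the standard MRv2 Job API. The threshold values below are only
> illustrative, and the identity pass-through job is just to show where the
> settings go; check mapred-default.xml for the actual defaults and caps.
>
>     import org.apache.hadoop.conf.Configuration;
>     import org.apache.hadoop.fs.Path;
>     import org.apache.hadoop.mapreduce.Job;
>     import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
>     import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>
>     public class UberModeExample {
>         public static void main(String[] args) throws Exception {
>             Configuration conf = new Configuration();
>             // Ask the framework to run small jobs' tasks serially inside
>             // the ApplicationMaster JVM instead of separate containers.
>             conf.setBoolean("mapreduce.job.ubertask.enable", true);
>             // A job only qualifies for uber mode if it stays within these
>             // limits (illustrative values; the framework has its own caps).
>             conf.setInt("mapreduce.job.ubertask.maxmaps", 9);
>             conf.setInt("mapreduce.job.ubertask.maxreduces", 1);
>
>             // Identity map/reduce pass-through: args[0] is the input dir,
>             // args[1] the output dir.
>             Job job = Job.getInstance(conf, "uber-example");
>             job.setJarByClass(UberModeExample.class);
>             FileInputFormat.addInputPath(job, new Path(args[0]));
>             FileOutputFormat.setOutputPath(job, new Path(args[1]));
>             System.exit(job.waitForCompletion(true) ? 0 : 1);
>         }
>     }
>
> When you submit it, the client output should report whether the job
> actually ran in uber mode.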
>
> Ulul
>
> On 15/02/2015 11:11, bit1...@163.com wrote:
>
> Hi, Hadoopers,
>
>  I am pretty new to Hadoop and I have a question: when a job runs, will
> each mapper or reducer task take up a JVM process or only a thread?
> I hear that the answer is a process. That is, say a job contains 5
> mappers and 2 reducers; will there then be 7 JVM processes?
> Thanks.
>
>  ------------------------------
>  bit1...@163.com
>
>
>
