Yep, and then the entire OS can't fork new processes.
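A quick way to confirm it is the per-user process limit (rather than memory) is to compare the limit against the live thread count. This is just a sketch for Linux; the `mapred` user name is an example, substitute whatever user runs your tasktracker:

```shell
# Per-user limit on processes; on Linux, threads count against this too
ulimit -u

# Total live threads owned by the tasktracker user
# ("mapred" is a placeholder user name)
ps -eLf | awk '$1 == "mapred"' | wc -l
```

Note that `ulimit` is per-shell/per-user, so check it as the same user the Hadoop daemons run as, not just as root.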

On 19 October 2012 05:10, 谢良 <[email protected]> wrote:

>  What's the exact OOM error message? Is it something like "OutOfMemoryError:
> unable to create new native thread"?
>  ------------------------------
> *From:* Aiden Bell [[email protected]]
> *Sent:* 18 October 2012 22:24
> *To:* [email protected]
> *Subject:* OOM/crashes due to process number limit
>
>  Hi All,
>
> I'm running quite a basic map/reduce job with 10 or so map tasks. During
> the task's execution, the entire stack (and my OS, for that matter) starts
> failing because it's unable to fork() new processes. It seems Hadoop (1.0.3)
> is creating 700+ threads and exhausting this resource; RAM utilisation is
> fine, however. This still occurs with ulimit set to unlimited.
>
> Any ideas or advice would be great; it seems very odd for a task that
> doesn't require much grunt.
>
> Cheers!
>
>


-- 
------------------------------------------------------------------
Never send sensitive or private information via email unless it is
encrypted. http://www.gnupg.org