What's the exact OOM error message? Is it something like "OutOfMemoryError: unable 
to create new native thread"?
________________________________
From: Aiden Bell [[email protected]]
Sent: 18 October 2012 22:24
To: [email protected]
主题: OOM/crashes due to process number limit

Hi All,

I'm running a fairly basic map/reduce job with 10 or so map tasks. During the 
tasks' execution, the entire stack (and my OS, for that matter) starts failing 
because it is unable to fork() new processes. Hadoop (1.0.3) seems to be creating 
700+ threads and exhausting this resource. RAM utilisation is fine, however.
This still occurs with ulimit set to unlimited.
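To narrow this down, it can help to compare the task JVM's actual thread count against the kernel's limits. A minimal diagnostic sketch, assuming a Linux box (the PID and the use of the current shell's PID `$$` are placeholders; substitute the TaskTracker or child JVM PID):

```shell
# Per-user process limit; on Linux, each thread counts against this.
# Note "unlimited" here does not override the system-wide caps below.
ulimit -u

# Number of lightweight processes (threads) for a given process.
# $$ is this shell's own PID, used as a stand-in for the Hadoop JVM's PID.
ps -o nlwp= -p $$

# System-wide caps that apply regardless of ulimit settings.
cat /proc/sys/kernel/threads-max
cat /proc/sys/kernel/pid_max
```

If the per-process thread count (nlwp) approaches any of these limits, fork()/Thread.start() will start failing with the "unable to create new native thread" flavour of OutOfMemoryError even though heap usage looks healthy.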

Any ideas or advice would be great; it seems very odd for a task that 
doesn't require much grunt.

Cheers!
