There isn't anything in the API as such. You could register your own JVM 
shutdown hook that does the cleanup.
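
Something like this rough sketch, assuming your mapper keeps a reference to 
the Process it spawns; the class name, executable path, and key/value types 
are just placeholders, not anything Hadoop-specific:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class ExternalProcessMapper extends Mapper<LongWritable, Text, Text, Text> {

    private Process external;     // the standalone executable
    private Thread cleanupHook;   // kills it when the JVM goes down

    @Override
    protected void setup(Context context) throws IOException {
        external = new ProcessBuilder("/path/to/executable").start();

        // Runs when the task JVM exits normally or receives SIGTERM.
        // It will not run on SIGKILL, which is why the setsid-based
        // session kill mentioned below is the more reliable safety net.
        cleanupHook = new Thread(new Runnable() {
            public void run() {
                if (external != null) {
                    external.destroy();   // SIGTERM to the child
                }
            }
        });
        Runtime.getRuntime().addShutdownHook(cleanupHook);
    }

    @Override
    protected void cleanup(Context context) {
        // Normal completion: kill the child and drop the hook.
        if (external != null) {
            external.destroy();
        }
        try {
            Runtime.getRuntime().removeShutdownHook(cleanupHook);
        } catch (IllegalStateException ignored) {
            // JVM is already shutting down; the hook will fire anyway.
        }
    }
}

Doing the teardown in cleanup() as well means the common case (a task that 
finishes normally) never relies on the hook at all.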

OTOH, if you are running this on Linux and a setsid binary is available, Hadoop 
itself will take care of killing these additional processes; in that case it 
kills the whole session, not just the task JVM.

Thanks,
+Vinod

On Oct 14, 2013, at 1:07 PM, Hider, Sandy wrote:

> I know the TaskTracker frequently kills mappers.  During the mapper setup we 
> are kicking off a runtime executable and passing data to and from it within 
> the mapper using Unix named pipes. 
> This works OK until a mapper is killed.  Currently, when this happens, the 
> standalone executable is left as a zombie process taking up resources.
>  
> Is there any way for my mapper to be notified of the kill, so it can shut 
> down the executable before being killed off?  A shutdown hook of some 
> kind?
>  
> Thanks in advance,
>  
> Sandy
>  

