It looks like you are hitting the limit on the maximum number of open file
handles on your system.
Is your application opening a lot of files at the same time? If you expect it
to keep many files open simultaneously, you can raise the maximum open files
limit on the Hadoop nodes. (The command "ulimit -n <limit>" updates the open
file handle limit for the current shell.)
If your application is not supposed to keep many files open at the same time,
please check why so many file handles are open.
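
One way to check, assuming a Linux node (replace <pid> with the process id of
the container JVM you suspect):

    # count the descriptors currently open by that process
    ls /proc/<pid>/fd | wc -l

    # or list them with lsof to see which files they point to
    lsof -p <pid>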

-Priyanka

On Fri, Oct 14, 2016 at 6:58 PM, chiranjeevi vasupilli <chiru....@gmail.com>
wrote:

>
> Hi Team,
>
> Can you please let me know when we would get this kind of exception? In my
> application, containers are getting killed with the below exception.
>
>
> java.lang.RuntimeException: java.io.IOException: All datanodes
> DatanodeInfoWithStorage[147.22.192.229:1004,DS-fa5a41a5-c953-477e-a5d9-3dc499d5baee,DISK]
> are bad. Aborting...
>         at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.endWindow(AbstractFileOutputOperator.java:948)
>         at com.datatorrent.stram.engine.GenericNode.processEndWindow(GenericNode.java:141)
>
>
> --
> ur's
> chiru
>
