Thank you, Priyanka.
We are opening a larger number of files in our app. Could you please let me
know how to find the current limit on the number of open files in the
cluster, so that we can update it?
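
For reference, on a Linux node the current limits can be checked like this
(a sketch; 12345 below is a placeholder pid for one of the container JVMs):

  # soft limit on open file descriptors for the current user/shell
  ulimit -n

  # hard limit (the ceiling the soft limit can be raised to)
  ulimit -Hn

  # limits of an already-running process, e.g. a YARN container JVM
  grep "open files" /proc/12345/limits

  # system-wide cap on file handles across all processes
  cat /proc/sys/fs/file-max

  # how many descriptors a process currently holds
  ls /proc/12345/fd | wc -l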
On Mon, Oct 17, 2016 at 11:13 AM, Priyanka Gugale <pri...@apache.org> wrote:
> Looks like you are reaching the limit set for max open file handles on
> your system.
> Is your application opening a lot of files at the same time? If you expect
> your application to open many files at once, you can increase the max open
> files limit on the Hadoop nodes by changing some settings. (Check the
> command "ulimit -n <limit>" to update your open file handles limit.)
> If your application is not supposed to open many files at the same time,
> please check why so many file handles are open simultaneously.
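> For example, on Linux the limit is usually raised persistently in
> /etc/security/limits.conf (a sketch; the user name "hadoop" is an
> assumption for whichever account runs your DataNode/YARN daemons, and the
> values are only illustrative):
>
>   # /etc/security/limits.conf
>   hadoop  soft  nofile  32768   # "hadoop" = assumed daemon user
>   hadoop  hard  nofile  65536
>
> After editing, log in again (or restart the daemons) and verify with
> "ulimit -n".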
> On Fri, Oct 14, 2016 at 6:58 PM, chiranjeevi vasupilli <
> chiru....@gmail.com> wrote:
>> Hi Team,
>> Can you please let me know under what conditions we would get this kind
>> of exception. In my application, containers are getting killed with the
>> exception below:
>> java.lang.RuntimeException: java.io.IOException: All datanodes
>> are bad. Aborting...
>> at com.datatorrent.lib.io.fs.AbstractFileOutputOperator.endWind
>> at com.datatorrent.stram.engine.GenericNode.processEndWindow(Ge