Hi,

I ran 4 workflows in parallel. Two of them ran and succeeded, but the other 2 failed with the following exception:

Exception when trying to cleanup container container_1442205308119_0176_01_000003: java.io.IOException: Cannot run program "kill": error=24, Too many open files
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:485)
        at org.apache.hadoop.util.Shell.run(Shell.java:455)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.containerIsAlive(DefaultContainerExecutor.java:430)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.signalContainer(DefaultContainerExecutor.java:401)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.cleanupContainer(ContainerLaunch.java:419)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncher.handle(ContainersLauncher.java:139)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncher.handle(ContainersLauncher.java:55)
        at org.apache.hadoop.yarn.event.AsyncDispatcher.dispatch(AsyncDispatcher.java:173)
        at org.apache.hadoop.yarn.event.AsyncDispatcher$1.run(AsyncDispatcher.java:106)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: error=24, Too many open files
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
        at java.lang.ProcessImpl.start(ProcessImpl.java:130)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
        ... 11 more



There is nothing wrong with the workflows themselves. Can someone help me with this?

Thanks
