Also look for out-of-memory exceptions in the logs. The forked process may exceed the virtual memory limit of the container.
Could you please add the following property in yarn-site.xml, restart YARN, and observe the behaviour of your application? If it works, you may need to increase the value of this parameter from the default of 2.1 to a higher value.

yarn.nodemanager.vmem-pmem-ratio : 2.1 (default)

Thanks & Regards,
Sandeep

On Sat, Feb 11, 2017 at 4:56 AM, Sanjay Pujare <[email protected]> wrote:

Chris,

It will be useful to look in a couple of places:

- Container logs (via the RM/node manager web UI) for the killed container. Even the RM's own log might indicate that it killed the container for some reason.
- If Apex is initiating the "termination" because of the blockage/delay Ram described, you should see an event or a log entry in the Stram saying so.

Can you check and let us know?

Sanjay

Join us at Apex Big Data World-San Jose <http://www.apexbigdata.com/san-jose.html>, April 4, 2017!
http://www.apexbigdata.com/san-jose-register.html

From: Munagala Ramanath <[email protected]>
Reply-To: <[email protected]>
Date: Friday, February 10, 2017 at 2:57 PM
To: <[email protected]>
Subject: Re: Running shell commands which fork from within Operator exits with 143 exit code

Looks like you may be blocking the operator thread with the p.waitFor() and the rest of the code to process the child process output.

Try using a separate thread to handle the child as described, for example, here:
http://stackoverflow.com/questions/26319804/adapting-c-fork-code-to-a-java-program

But avoid calls like Process.waitFor() and Thread.join(), which can potentially block for a long time.

Ram

On Thu, Feb 9, 2017 at 11:02 AM, Chris Benninger <[email protected]> wrote:

Hi,

I have a little bit of a weird use case, but I'm running a script from within an operator which forks a subprocess using & (I know, I know). I'm porting this from Spark as a PoC, and in Apex it seems to be sending the parent shell process a SIGTERM, resulting in a 143. However, when I remove the fork (altering the output), it all works fine.

My code:

String cmd = "bash myscript.sh";

Process p = Runtime.getRuntime().exec(cmd, null, workingDir.toFile());
p.waitFor();

BufferedReader stdoutReader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
while ((line = stdoutReader.readLine()) != null) {
    System.out.println(line);
}

BufferedReader stderrReader = new BufferedReader(new InputStreamReader(p.getErrorStream()));
while ((line = stderrReader.readLine()) != null) {
    System.out.println(line);
}

int retValue = p.exitValue();
System.out.println("Exit code: " + retValue);

Any help appreciated.
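[Editor's note: Ram's suggestion can be sketched roughly as below: drain stdout and stderr on separate threads before waiting, and use a bounded waitFor instead of an unbounded one, so neither the child (on a full pipe buffer) nor the operator thread blocks indefinitely. The command here is a stand-in for Chris's "bash myscript.sh"; this is an assumption-laden sketch, not Apex-specific code.]

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.concurrent.TimeUnit;

public class ChildProcessSketch {
    // Drain a stream on its own thread so the child never blocks
    // on a full pipe buffer while we wait for it to exit.
    static Thread drain(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(tag + line);
                }
            } catch (Exception e) {
                // Stream closes when the process dies; nothing more to do.
            }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in command; Chris's original was "bash myscript.sh" in a working dir.
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo hello; echo oops 1>&2");
        Process p = pb.start();

        // Start draining BEFORE waiting, unlike the original snippet,
        // which called waitFor() first and read the output afterwards.
        Thread out = drain(p.getInputStream(), "stdout: ");
        Thread err = drain(p.getErrorStream(), "stderr: ");

        // Bounded wait instead of an unbounded waitFor().
        if (!p.waitFor(30, TimeUnit.SECONDS)) {
            p.destroyForcibly();
        }
        // Bounded joins so we never hang on the reader threads either.
        out.join(5000);
        err.join(5000);
        System.out.println("Exit code: " + p.exitValue());
    }
}
```

In an operator, even the bounded wait above would ideally run off the operator thread entirely, per Ram's advice, so that window processing is never stalled.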

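[Editor's note: the yarn-site.xml change Sandeep suggests would look roughly like the fragment below. The property name and its 2.1 default are standard YARN NodeManager settings; the value 5 is only an example of "a higher value", not a recommendation.]

```xml
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <!-- Default is 2.1; raise it if containers are killed for exceeding
       their virtual memory allowance. -->
  <value>5</value>
</property>
```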