Re: SparkLauncher not notified about finished job - hangs infinitely.

2015-08-03 Thread Tomasz Guziałek
Tomasz: Please take a look at the Redirector class inside: ./launcher/src/test/java/org/apache/spark/launcher/SparkLauncherSuite.java FYI On Fri, Jul 31, 2015 at 10:02 AM, Elkhan Dadashov …

Re: SparkLauncher not notified about finished job - hangs infinitely.

2015-07-31 Thread Ted Yu
Tomasz: Please take a look at the Redirector class inside: ./launcher/src/test/java/org/apache/spark/launcher/SparkLauncherSuite.java FYI On Fri, Jul 31, 2015 at 10:02 AM, Elkhan Dadashov wrote: > Hi Tomasz, > > *Answer to your 1st question*: > > Clear/read the error (spark.getErrorStream()) an…

Re: SparkLauncher not notified about finished job - hangs infinitely.

2015-07-31 Thread Elkhan Dadashov
Nope, the output stream of that subprocess should be spark.getInputStream(). According to the Oracle doc: "public abstract InputStream getInputStream(…

Re: SparkLauncher not notified about finished job - hangs infinitely.

2015-07-31 Thread Ted Yu
Minor typo: bq. output (spark.getInputStream()) Should be spark.getOutputStream() Cheers On Fri, Jul 31, 2015 at 10:02 AM, Elkhan Dadashov wrote: > Hi Tomasz, > > *Answer to your 1st question*: > > Clear/read the error (spark.getErrorStream()) and output > (spark.getInputStream()) stream buff…

Re: SparkLauncher not notified about finished job - hangs infinitely.

2015-07-31 Thread Elkhan Dadashov
Hi Tomasz, *Answer to your 1st question*: Clear/read the error (spark.getErrorStream()) and output (spark.getInputStream()) stream buffers before you call spark.waitFor(); it would be better to clear/read them with 2 different threads. Then it should work fine. As the Spark job is launched as a subpro…
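The two-thread draining pattern Elkhan describes can be sketched as follows against a plain java.lang.Process. This is a minimal sketch, not code from the thread: the `DrainDemo` class name and the `java -version` stand-in subprocess are illustrative assumptions; with SparkLauncher you would instead use the Process returned by launch().

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class DrainDemo {
    // Drains an InputStream on its own thread so the subprocess cannot
    // block when its stdout/stderr pipe buffer fills up.
    static Thread drain(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(tag + line);
                }
            } catch (IOException ignored) {
                // stream closes when the subprocess exits
            }
        });
        t.start();
        return t;
    }

    static int run() throws Exception {
        // Stand-in subprocess (illustrative assumption); with SparkLauncher
        // this would be: Process spark = new SparkLauncher()...launch();
        Process p = new ProcessBuilder("java", "-version").start();
        Thread out = drain(p.getInputStream(), "[stdout] "); // subprocess stdout
        Thread err = drain(p.getErrorStream(), "[stderr] "); // subprocess stderr
        int exit = p.waitFor(); // safe now: both buffers are being emptied
        out.join();
        err.join();
        return exit;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit=" + run());
    }
}
```

Note the mapping on java.lang.Process: getInputStream() is the child's stdout, getErrorStream() its stderr, and getOutputStream() its stdin, which is what the naming back-and-forth later in the thread is about.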

SparkLauncher not notified about finished job - hangs infinitely.

2015-07-31 Thread Tomasz Guziałek
I am trying to submit a JAR with a Spark job to a YARN cluster from Java code. I am using SparkLauncher to submit the SparkPi example: Process spark = new SparkLauncher() .setAppResource("C:\\spark-1.4.1-bin-hadoop2.6\\lib\\spark-examples-1.4.1-hadoop2.6.0.jar") .setMainClass("…
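Putting the question and the replies together, a hedged sketch of the full submission might look like the following (it needs spark-launcher on the classpath and a configured cluster, so it is not runnable standalone). The main class, master, and paths below are illustrative assumptions; only setAppResource and the drain-before-waitFor ordering come from the thread.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

import org.apache.spark.launcher.SparkLauncher;

public class SubmitExample {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("C:\\spark-1.4.1-bin-hadoop2.6\\lib\\spark-examples-1.4.1-hadoop2.6.0.jar")
                .setMainClass("org.apache.spark.examples.SparkPi") // assumed example class
                .setMaster("yarn-cluster")                         // assumed master
                .launch();

        // Drain stdout and stderr on separate threads BEFORE waitFor();
        // otherwise the child fills its pipe buffers and both sides hang.
        Thread out = drain(spark.getInputStream(), "[stdout] ");
        Thread err = drain(spark.getErrorStream(), "[stderr] ");

        int exitCode = spark.waitFor(); // now returns once spark-submit exits
        out.join();
        err.join();
        System.out.println("spark-submit exit code: " + exitCode);
    }

    private static Thread drain(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(tag + line);
                }
            } catch (IOException ignored) { /* child exited */ }
        });
        t.start();
        return t;
    }
}
```

As Ted's pointer suggests, the Redirector class in SparkLauncherSuite in the Spark source tree shows the project's own version of this stream-forwarding pattern.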