I am trying to submit a JAR containing a Spark job to a YARN cluster from Java
code. I am using SparkLauncher to submit the SparkPi example:

    Process spark = new SparkLauncher()
        .setAppResource("C:\\spark-1.4.1-bin-hadoop2.6\\lib\\spark-examples-1.4.1-hadoop2.6.0.jar")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setMaster("yarn-cluster")
        .launch();
    System.out.println("Waiting for finish...");
    int exitCode = spark.waitFor();
    System.out.println("Finished! Exit code:" + exitCode);

There are two problems:

1. When submitting in "yarn-cluster" mode, the application is successfully
submitted to YARN and executes successfully (it is visible in the YARN UI,
reported as SUCCESS, and the PI value is printed in the output). However, the
submitting application is never notified that processing has finished - it
hangs indefinitely after printing "Waiting for finish...". The log of the
container can be found here: http://pastebin.com/LscBjHQc
2. When submitting in "yarn-client" mode, the application does not appear in
the YARN UI and the submitting application hangs at "Waiting for finish...".
When the hanging process is killed, the application shows up in the YARN UI
and is reported as SUCCESS, but the output is empty (the PI value is not
printed); the yarn-client variant of the launcher call is sketched right after
this list. The log of the container can be found here:
http://pastebin.com/9KHi81r4
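For reference, the yarn-client run differs only in the master setting. The
sketch below also shows where the Spark home and a HADOOP_CONF_DIR entry could
be passed to the launcher (it additionally needs java.util.HashMap/Map); the
two paths are placeholders for my local setup, not necessarily the values I
actually use:

    // Environment passed to the spark-submit child process; the path is a
    // placeholder for the local Hadoop configuration directory.
    Map<String, String> env = new HashMap<>();
    env.put("HADOOP_CONF_DIR", "C:\\hadoop\\etc\\hadoop");

    Process spark = new SparkLauncher(env)
        .setSparkHome("C:\\spark-1.4.1-bin-hadoop2.6")
        .setAppResource("C:\\spark-1.4.1-bin-hadoop2.6\\lib\\spark-examples-1.4.1-hadoop2.6.0.jar")
        .setMainClass("org.apache.spark.examples.SparkPi")
        .setMaster("yarn-client")   // the only intentional difference from the yarn-cluster run
        .setVerbose(true)           // make spark-submit log the command it builds
        .launch();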

I tried running the submitting application with both Oracle Java 7 and 8.

Any hints as to what might be wrong?

Best regards,
Tomasz
