Hi,

I am launching a Spark job from a Java application using SparkLauncher. My code
is as follows:

import java.io.IOException;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

SparkAppHandle jobHandle;
try {
    // Launch the job in cluster deploy mode against the standalone
    // master's REST submission port (6066).
    jobHandle = new SparkLauncher()
            .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")
            .setAppResource("hdfs://server/inputs/test.jar")
            .setMainClass("com.test.TestJob")
            .setMaster("spark://server:6066")
            .setVerbose(true)
            .setDeployMode("cluster")
            .addAppArgs("abc")
            .startApplication();
} catch (IOException e) {
    throw new RuntimeException(e);
}

// Busy-wait until the handle reports a final state (FINISHED, FAILED, or KILLED).
while (!jobHandle.getState().isFinal());


I can see my job running on the Spark UI, and it also finishes without any errors.

However, my Java application never terminates, because jobHandle.getState()
always returns the UNKNOWN state. What am I missing here? My Spark API version
is 2.0.0. One more detail that might be relevant: my launcher application is
running on Windows.
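
For completeness, the launcher API also supports event callbacks, which I could
use instead of the busy-wait. Below is a minimal sketch of that variant,
assuming the spark-launcher 2.0.0 API where startApplication() accepts
SparkAppHandle.Listener instances; the CountDownLatch is just my own way of
blocking the main thread until a final state is reported:

import java.io.IOException;
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class ListenerLaunch {
    public static void main(String[] args) throws IOException, InterruptedException {
        final CountDownLatch done = new CountDownLatch(1);

        new SparkLauncher()
                .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7")
                .setAppResource("hdfs://server/inputs/test.jar")
                .setMainClass("com.test.TestJob")
                .setMaster("spark://server:6066")
                .setVerbose(true)
                .setDeployMode("cluster")
                .addAppArgs("abc")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle handle) {
                        // Called on every state transition; release the latch
                        // once a final state (FINISHED, FAILED, KILLED) arrives.
                        System.out.println("State changed to: " + handle.getState());
                        if (handle.getState().isFinal()) {
                            done.countDown();
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle handle) {
                        // Called when app info (e.g. the application id) changes.
                    }
                });

        done.await(); // block without spinning until the job ends
    }
}

This avoids pegging a CPU core while waiting, though I assume it would hit the
same problem if the handle never receives state updates in the first place.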

Regards,
Vatsal
