Hi,

I am trying to get a simple Spark application to run programmatically. I
looked at the launcher package documentation at
http://spark.apache.org/docs/2.1.0/api/java/index.html?org/apache/spark/launcher/package-summary.html
and at the following code.

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class MyLauncher {
      public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
          .setAppResource("/my/app.jar")
          .setMainClass("my.spark.app.Main")
          .setMaster("local")
          .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
          .startApplication();
        // Use handle API to monitor / control application.
      }
    }


This runs without errors for my application, but I am running Spark in
local mode and the launcher class exits immediately after executing this
function. Are we supposed to wait for the process state, etc.?

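For example, I assume the launcher JVM has to be kept alive until the handle
reports a final state, something like the sketch below. This is just my guess
at the intended usage: the class name and the latch-based wait are mine, and
the jar path and main class are the placeholders from the example above.

    import java.util.concurrent.CountDownLatch;
    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class BlockingLauncher {
      public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
          .setAppResource("/my/app.jar")            // placeholder jar from the example above
          .setMainClass("my.spark.app.Main")        // placeholder main class
          .setMaster("local")
          .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
          .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle h) {
              System.out.println("State changed to " + h.getState());
              if (h.getState().isFinal()) {
                done.countDown();   // release the main thread once the app ends
              }
            }
            @Override
            public void infoChanged(SparkAppHandle h) {
              // application id etc. become available through the handle here
            }
          });

        // Without this the launcher JVM returns from main() right away,
        // which is what I am seeing.
        done.await();
        System.out.println("Application finished in state " + handle.getState());
      }
    }
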
Is there a more detailed example of how to monitor the input streams, etc.?
Any GitHub link or blog post would help.
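
For the stream part, am I right that launch() has to be used instead of
startApplication() to get at the child process output directly? This is
roughly what I had in mind, again just a sketch: redirectError() merging
stderr into stdout is my reading of the javadoc, and the jar/class names are
the placeholders from above.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.spark.launcher.SparkLauncher;

    public class StreamReadingLauncher {
      public static void main(String[] args) throws Exception {
        // launch() returns the raw spark-submit Process instead of a handle,
        // so its output can be consumed like any other child process.
        Process spark = new SparkLauncher()
          .setAppResource("/my/app.jar")            // placeholder jar from the example above
          .setMainClass("my.spark.app.Main")        // placeholder main class
          .setMaster("local")
          .redirectError()                          // send stderr to the same stream as stdout
          .launch();

        try (BufferedReader out = new BufferedReader(
            new InputStreamReader(spark.getInputStream()))) {
          String line;
          while ((line = out.readLine()) != null) {
            System.out.println("[spark] " + line);
          }
        }

        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with code " + exitCode);
      }
    }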

Thanks
Nipun
