That feature has not been implemented yet. https://issues.apache.org/jira/browse/SPARK-11033
On Wed, Jun 6, 2018 at 5:18 AM, Behroz Sikander <[email protected]> wrote:
> I have a client application which launches multiple jobs in a Spark cluster
> using SparkLauncher. I am using standalone cluster mode. Launching jobs has
> worked fine so far; I use launcher.startApplication() to launch.
>
> But now I have a requirement to check the state of my driver process. I
> added a listener implementing SparkAppHandle.Listener, but I don't get any
> events. I am following the approach described here:
> https://www.linkedin.com/pulse/spark-launcher-amol-kale
>
> I tried the same code in client mode and I receive all the events as
> expected.
>
> So I am guessing that something different needs to be done in cluster mode.
> Is there an example for cluster mode?
>
> Regards,
> Behroz

--
Marcelo
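For reference, the listener-based launch being described looks roughly like the sketch below. This is an illustrative example, not Behroz's actual code: the jar path, main class, and master URL are placeholders, and it requires the `spark-launcher` artifact on the classpath.

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

import java.util.concurrent.CountDownLatch;

public class LauncherExample {
    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        new SparkLauncher()
                .setAppResource("/path/to/app.jar")   // placeholder application jar
                .setMainClass("com.example.MyApp")    // placeholder main class
                .setMaster("spark://master:7077")     // placeholder standalone master URL
                // In standalone cluster mode, listener events are not delivered
                // (SPARK-11033); in client mode they are.
                .setDeployMode("cluster")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle handle) {
                        System.out.println("State: " + handle.getState());
                        if (handle.getState().isFinal()) {
                            done.countDown();
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle handle) {
                        // Fired when handle info changes, e.g. once the
                        // application ID becomes available.
                    }
                });

        // In client mode this latch is eventually released; in standalone
        // cluster mode no callbacks arrive, so until SPARK-11033 is resolved
        // the driver's status has to be checked out of band instead.
        done.await();
    }
}
```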
