When you start the spark-shell, it's already too late to get the
ApplicationStart event. Try listening for StageCompleted or JobEnd instead.
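
Something along these lines should fire (an untested sketch against the 1.x
listener API; I've also added onTaskEnd, since taskEnd.reason is where
executor-side failures such as ExceptionFailure are reported):

======
import org.apache.spark.scheduler.SparkListener
import org.apache.spark.scheduler.SparkListenerJobEnd
import org.apache.spark.scheduler.SparkListenerStageCompleted
import org.apache.spark.scheduler.SparkListenerTaskEnd

sc.addSparkListener(new SparkListener() {
  // Fires when each stage finishes, whether it succeeded or failed.
  override def onStageCompleted(stageCompleted: SparkListenerStageCompleted) {
    println(">>>> onStageCompleted: " + stageCompleted.stageInfo.name)
  }

  // Fires when each job finishes; jobResult says whether it succeeded.
  override def onJobEnd(jobEnd: SparkListenerJobEnd) {
    println(">>>> onJobEnd: " + jobEnd.jobResult)
  }

  // taskEnd.reason carries executor-side failures for individual tasks.
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd) {
    println(">>>> onTaskEnd: " + taskEnd.reason)
  }
})
======

The callbacks run on the driver's listener bus, so the println output shows
up in the shell.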

On Fri, Apr 17, 2015 at 5:54 PM, Praveen Balaji <
secondorderpolynom...@gmail.com> wrote:

> I'm trying to create a simple SparkListener to get notified of errors on
> executors, but I don't get any callbacks on my SparkListener. Here's some
> simple code I'm executing in spark-shell. Am I doing something wrong?
>
> Thanks for any clue you can send my way.
>
> Cheers
> Praveen
>
> ======
> import org.apache.spark.scheduler.SparkListener
> import org.apache.spark.scheduler.SparkListenerApplicationStart
> import org.apache.spark.scheduler.SparkListenerApplicationEnd
> import org.apache.spark.SparkException
>
>     sc.addSparkListener(new SparkListener() {
>       override def onApplicationStart(applicationStart: SparkListenerApplicationStart) {
>         println(">>>> onApplicationStart: " + applicationStart.appName);
>       }
>
>       override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd) {
>         println(">>>> onApplicationEnd: " + applicationEnd.time);
>       }
>     });
>
>     sc.parallelize(List(1, 2, 3)).map(throw new SparkException("test")).collect();
> =======
>
> output:
>
> scala> org.apache.spark.SparkException: test
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC.<init>(<console>:34)
>         at $iwC$$iwC.<init>(<console>:36)
>         at $iwC.<init>(<console>:38)
