>> Code would be very helpful,
I will try to put together something to post here.

>> 1. Writing in Java
I am using Scala.


>> Wrapping the entire app in a try/catch
Once the SparkContext is created, a Future is started in which the actions
and transformations are defined and the streaming context is started.
I am using spark-jobserver, and here is how the job is started (job.runJob()
defines all the actions/transformations and starts the streaming context):
<https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server/src/main/scala/spark/jobserver/JobManagerActor.scala#L570>
As mentioned in my original message, I am sometimes able to catch the
exception in this block:
<https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server/src/main/scala/spark/jobserver/JobManagerActor.scala#L606>
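To make the pattern concrete, here is a minimal sketch of how I launch the job: a Future wraps the runJob() call, the streaming context is started inside it, and a try/catch around the body is where driver-side exceptions sometimes surface. The method name startJob and the error handling shown are illustrative, not the actual spark-jobserver code.

```scala
import scala.concurrent.{ExecutionContext, Future}
import org.apache.spark.streaming.StreamingContext

// Illustrative sketch (not the real JobManagerActor code): the job body
// runs inside a Future, mirroring how spark-jobserver invokes runJob().
def startJob(ssc: StreamingContext)(implicit ec: ExecutionContext): Future[Unit] =
  Future {
    try {
      // Equivalent of job.runJob(): the DStream transformations/actions
      // are already defined on ssc at this point.
      ssc.start()
      ssc.awaitTermination()
    } catch {
      case e: Throwable =>
        // Exceptions reaching the driver thread are sometimes caught here,
        // analogous to the error-handling block linked above.
        ssc.stop(stopSparkContext = true, stopGracefully = false)
        throw e
    }
  }
```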

>> 3. Executing in local mode
I am running in cluster mode.


>> The code that is throwing the exceptions is not executed locally in the
>> driver process. Spark is executing the failing code on the cluster.
Yes, the code executes on the executors, but once a task fails 4 times, the
exception seems to be rethrown on the driver side.
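For reference, the behavior I am seeing matches this sketch: once a task has failed spark.task.maxFailures times (4 by default), Spark aborts the stage and a SparkException propagates to the driver thread blocked in awaitTermination(), where it can be caught. The method name runAndCatch is hypothetical.

```scala
import org.apache.spark.SparkException
import org.apache.spark.streaming.StreamingContext

// Hypothetical sketch: after repeated executor-side task failures
// (spark.task.maxFailures, default 4), the aborted stage surfaces as a
// SparkException on the driver, observable around awaitTermination().
def runAndCatch(ssc: StreamingContext): Unit =
  try {
    ssc.start()
    ssc.awaitTermination()
  } catch {
    case e: SparkException =>
      // Driver-side view of the repeated executor failures.
      println(s"Job aborted on driver: ${e.getMessage}")
  }
```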





--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
