Code would be very helpful, but it *seems like* you are:

1. Writing in Java
2. Wrapping the *entire app* in a try/catch
3. Executing in local mode

The code that is throwing the exceptions is not executed locally in the
driver process. Spark is executing the failing code on the cluster.
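To illustrate what I mean, here is a minimal Java sketch (not your code; the
class name, the toy dataset, and the deliberately failing lambda are all made
up for illustration). The lambda passed to map() is serialized and run on the
executors, so a try/catch in main() only sees whatever the action reports back
to the driver when the job ultimately fails, typically a
org.apache.spark.SparkException wrapping the executor-side cause, not the raw
exception thrown inside the lambda:

import java.util.Arrays;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class DriverCatchExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("driver-catch-example")
                .getOrCreate();
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        try {
            jsc.parallelize(Arrays.asList(1, 2, 3))
               // This lambda runs on the executors, not in the driver JVM.
               .map(x -> {
                   if (x == 2) {
                       throw new IllegalStateException("boom on an executor");
                   }
                   return x * 10;
               })
               // Nothing reaches the driver until an action runs and the job
               // fails after task retries; the failure then surfaces here,
               // wrapped, rather than where the lambda threw it.
               .collect();
        } catch (Exception e) {
            System.err.println("Job failed in the driver; cause: " + e.getCause());
        } finally {
            spark.stop();
        }
    }
}

If you run the same code with a local[*] master the tasks happen to execute in
the driver JVM, which can hide this distinction, so it is worth confirming
which deploy mode you are actually using.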

On Sun, May 12, 2019 at 3:37 PM bsikander <behro...@gmail.com> wrote:

> Hi,
> Anyone? This should be a straight forward one :)
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>

-- 
Thanks,
Jason
