Who's your Spark provider? EMR, Azure, Databricks, etc.? Maybe contact
them, since they've probably applied some patches.

Also, have you checked `stdout` for segfaults? I vaguely remember
getting `Task failed while writing rows at` errors and seeing segfaults
that turned out to be the cause.
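For what it's worth, here is a minimal sketch of the kind of scan I mean. The sample log text is invented for illustration; in practice you would pipe the executor stdout/stderr into the same `grep` (e.g. via `yarn logs -applicationId <your-app-id>` on YARN, or your provider's log download). A JVM native crash typically surfaces as a `SIGSEGV` line or a reference to an `hs_err_pid<N>.log` file.

```shell
# Sample of what a JVM segfault looks like in executor stdout
# (text invented here for demonstration purposes).
sample='# A fatal error has been detected by the Java Runtime Environment:
#  SIGSEGV (0xb) at pc=0x00007f2a1c0de123, pid=1234, tid=5678'

# Count lines matching typical JVM crash markers (case-insensitive).
# In practice, replace the printf with: yarn logs -applicationId <your-app-id>
printf '%s\n' "$sample" | grep -icE 'sigsegv|hs_err_pid|fatal error'
```

Running this prints `2`, since both sample lines match a crash marker; a non-zero count on real executor logs is a strong hint the task failure is a native crash rather than a Spark-level error.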

On Wed, Feb 28, 2018 at 2:07 PM, unk1102 <umesh.ka...@gmail.com> wrote:

> Hi, thanks Vadim, you are right. I already saw line 468; I don't see any
> code there, it is just a comment. Yes, I am sure I am using spark-* jars
> built for Spark 2.2.0 and Scala 2.11. I am also stuck with these errors,
> unfortunately, and not sure how to solve them.
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
