Use a try/catch inside the map. See the following example.

val csvRDD = myRDD.map(x => {
  var index = "null"
  try {
    index = x.toString.split(",")(0)
  } catch {
    case e: Exception => println("Exception!! => " + e)
  }
  (index, x)
})
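
To actually see which line is failing, you could also print the record itself when the parse throws. Here is a rough sketch of that variant using scala.util.Try instead of try/catch (myRDD is the same RDD as above, and csvRDD2 is just a placeholder name); the bad record then shows up in the executor logs while the job keeps running:

import scala.util.{Failure, Success, Try}

val csvRDD2 = myRDD.map { x =>
  Try(x.toString.split(",")(0)) match {
    case Success(index) => (index, x)
    case Failure(e) =>
      // Log the offending record so it can be found in the executor logs
      println("Failed to parse line: " + x + " => " + e)
      ("null", x)
  }
}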


Thanks
Best Regards

On Thu, Oct 9, 2014 at 6:46 PM, poiuytrez <guilla...@databerries.com> wrote:

> Hi,
>
> I am parsing a csv file with Spark using the map function. One of the lines
> in the csv file makes a task fail (and then the whole job fails). Is there a
> way to do some debugging to find the line which fails?
>
> Best regards,
> poiuytrez
>
>
>
>
