Hi Martin,

Yes, that is what it seems. However, it is unlikely to be the case,
because I have all Spark classes in my home directory, which is mounted via
NFS on all nodes. Unless there is something else I am missing...

Edu


On Thu, Oct 3, 2013 at 3:29 PM, Martin Weindel <[email protected]> wrote:

>  Hi Eduardo,
>
> it seems to me that your second problem is caused by inconsistent (i.e.
> different) classes in the master and worker JVMs.
> Are you sure that you have replaced the changed FlatMapFunction on all
> worker nodes and also on the master?
>
> Regards,
> Martin
>
>  13/10/03 13:27:44 INFO cluster.ClusterTaskSetManager: Lost TID 0 (task
> 1.0:0)
> 13/10/03 13:27:44 INFO cluster.ClusterTaskSetManager: Loss was due to
> java.io.InvalidClassException
> java.io.InvalidClassException:
> org.apache.spark.api.java.function.FlatMapFunction; local class
> incompatible: stream classdesc serialVersionUID = -1748278142466443391,
> local class serialVersionUID = 2220150375729402137
>
>
>
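For context on the error in the quoted log: when a serializable class has no explicit serialVersionUID, the JVM derives one from the class structure, so recompiling a changed class produces a new UID and deserialization against the old class fails with java.io.InvalidClassException. A minimal sketch of the usual safeguard (the class name here is hypothetical, not from the thread) is pinning the UID explicitly:

```java
import java.io.*;

// Sketch: pinning serialVersionUID so that two JVMs (e.g. a Spark master
// and a worker) agree on the class version even after a recompile.
public class SerialVersionDemo {
    static class MyFunction implements Serializable {
        // Explicit UID: the JVM no longer derives one from the class
        // structure, so adding a method later won't change it.
        private static final long serialVersionUID = 1L;
    }

    public static void main(String[] args) throws Exception {
        // Round-trip an instance through Java serialization.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        new ObjectOutputStream(bytes).writeObject(new MyFunction());
        Object back = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray())).readObject();
        System.out.println(ObjectStreamClass.lookup(back.getClass())
                .getSerialVersionUID());
    }
}
```

Note that this only helps when the class change is actually serialization-compatible; with genuinely different class definitions on master and workers, the fix is to redeploy the same jar everywhere.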
