Dear Gerard,

I don’t have the problem when running on a single machine.

I think I found the problem: since I build my project using Maven, the old 
Spark copy is still cached in my local repository (~/.m2/repository/org…). 
Every time I modify my local Spark and rebuild it, I have to clean whatever 
Maven has cached in my home directory before the new build is picked up.
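
In case it helps anyone else hitting the same InvalidClassException, what 
works for me is roughly the following (the path under ~/.m2 is a placeholder, 
since the exact group directory depends on your Spark version; this assumes 
the modified Spark is installed into the local repository with Maven):

    # delete the stale Spark artifacts that Maven cached locally
    rm -rf ~/.m2/repository/org/<spark-group-path>

    # rebuild the modified Spark and reinstall it into ~/.m2
    mvn clean install -DskipTests

The maven-dependency-plugin's purge-local-repository goal 
(mvn dependency:purge-local-repository) can do much the same thing without 
deleting directories by hand.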

Thank you
Zuhair Khayyat

On Nov 28, 2013, at 2:29 PM, Gerard Maas <gerard.m...@gmail.com> wrote:

> Hi Zuhair,
> 
> Judging from the exception, you have two different versions of the class
> somewhere. Do you get the same behavior if you run it on a single node?
> Maybe Spark veterans have more specific tips for you.
> 
> kr, Gerard.
> 
> 
> On Wed, Nov 27, 2013 at 5:00 PM, Zuhair Khayyat <zuhair.khay...@gmail.com> wrote:
> 
>> Dear Gerard,
>> 
>> All servers share the Spark binaries through NFS, so it is unlikely that
>> the other servers contain an old copy of the class. I will test later with
>> a single server and see if I get the same problem.
>> 
>> Regards,
>> Zuhair Khayyat
>> 
>> On Nov 27, 2013, at 6:29 PM, Gerard Maas <gerard.m...@gmail.com> wrote:
>> 
>>> From the looks of your exception, you modified your local class but
>>> forgot to deploy those local changes to the cluster. This error message:
>>> classdesc serialVersionUID = 5151096093324583655, local class
>>> serialVersionUID = 9012954318378784201
>>> 
>>> indicates that the version being deserialized is different from the local
>>> version. Make sure you deploy your changes across your Spark cluster.
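>>> 
>>> As an aside, the reason a small code change shifts the UID is that, when
>>> no explicit serialVersionUID is declared, the JVM derives one from the
>>> class structure, so adding even a dummy method changes it. A minimal
>>> Scala sketch (the class name here is made up) of pinning it explicitly:
>>> 
>>>     // With the annotation, the UID stays stable when methods are added
>>>     // or removed; without it, the JVM computes a default from the class
>>>     // layout, which is why the dummy function in RDD.scala produced a
>>>     // new UID.
>>>     @SerialVersionUID(1L)
>>>     class MyTask extends Serializable {
>>>       def run(): Unit = println("running")
>>>     }
>>> 
>>> That only makes the versions match formally, of course; you still need
>>> the same class bytes on both sides for the code to behave consistently.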
>>> 
>>> -kr, Gerard.
>>> 
>>> 
>>> On Wed, Nov 27, 2013 at 4:22 PM, Zuhair Khayyat <zuhair.khay...@gmail.com> wrote:
>>> 
>>>> Dear Spark members,
>>>> 
>>>> I am trying to start developing on the Spark source code. I have added a
>>>> new dummy function in RDD.scala to test whether it compiles and runs. The
>>>> modified Spark compiled correctly, but when I execute my code I get the
>>>> following error:
>>>> 
>>>> java.io.InvalidClassException: spark.RDD; local class incompatible: stream classdesc serialVersionUID = 5151096093324583655, local class serialVersionUID = 9012954318378784201
>>>>        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
>>>>        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
>>>>        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1515)
>>>>        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
>>>>        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1515)
>>>>        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1769)
>>>>        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>>>>        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>>>>        at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
>>>>        at spark.scheduler.ShuffleMapTask$.deserializeInfo(ShuffleMapTask.scala:54)
>>>>        at spark.scheduler.ShuffleMapTask.readExternal(ShuffleMapTask.scala:111)
>>>>        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1835)
>>>>        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1794)
>>>>        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>>>>        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>>>>        at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
>>>>        at spark.JavaSerializerInstance.deserialize(JavaSerializer.scala:45)
>>>>        at spark.executor.Executor$TaskRunner.run(Executor.scala:96)
>>>>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>>        at java.lang.Thread.run(Thread.java:724)
>>>> 13/11/27 17:47:43 ERROR executor.StandaloneExecutorBackend: Driver or worker disconnected! Shutting down.
>>>> 
>>>> Can you please help me find out what went wrong? Thank you.
>>>> 
>>>> Zuhair Khayyat
>>>> 
>> 
>> 
