Re: Invalid Class Exception

2014-09-03 Thread niranda
Hi,

I'm getting the same error while manually setting up a Spark cluster.

Has there been any update about this error?

Rgds

Niranda





Re: Invalid Class Exception

2014-06-06 Thread Jenny Zhao
We experienced a similar issue in our environment; the whole stack trace is
below. It works fine if we run in local mode, but if we run in cluster mode
(even with the Master and one worker on the same node) we hit this
serialVersionUID issue. We use Spark 1.0.0, compiled with JDK 6.

Here is a link about serialVersionUID and advice on using it, which suggests
defining an explicit serialVersionUID in every Serializable class:
http://stackoverflow.com/questions/285793/what-is-a-serialversionuid-and-why-should-i-use-it
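In Scala, pinning the UID looks roughly like the sketch below (the class name
is hypothetical). With an explicit UID, every JVM agrees on the value no
matter which compiler or JDK produced the bytecode:

    // Hypothetical example: fix the UID instead of letting the JVM derive it.
    @SerialVersionUID(1L)
    class MyRecord(val id: Int, val value: String) extends Serializable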


14/06/05 09:52:18 WARN scheduler.TaskSetManager: Lost TID 9 (task 1.0:9)
14/06/05 09:52:18 WARN scheduler.TaskSetManager: Loss was due to java.io.InvalidClassException
java.io.InvalidClassException: org.apache.spark.SerializableWritable; local class incompatible: stream classdesc serialVersionUID = 6301214776158303468, local class serialVersionUID = -7785455416944904980
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:630)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1600)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1513)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1749)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:365)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
    at org.apache.spark.broadcast.HttpBroadcast$.read(HttpBroadcast.scala:165)
    at org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:56)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1039)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1866)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:365)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1039)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1866)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1770)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:365)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
    at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
    at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
    at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1809)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1768)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1346)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:365)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
    at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:195)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
    at 

Re: Invalid Class Exception

2014-06-04 Thread Suman Somasundar

I am building Spark myself, using Java 7 to both build and run.

I will try with Java 6.

Thanks,
Suman.

On 6/3/2014 7:18 PM, Matei Zaharia wrote:

What Java version do you have, and how did you get Spark (did you build it 
yourself by any chance or download a pre-built one)? If you build Spark 
yourself you need to do it with Java 6 — it’s a known issue because of the way 
Java 6 and 7 package JAR files. But I haven’t seen it result in this particular 
error.

Matei

On Jun 3, 2014, at 5:18 PM, Suman Somasundar suman.somasun...@oracle.com 
wrote:


Hi all,

I get the following exception when using Spark to run the example k-means program. I am using Spark 1.0.0 and running the program locally.

java.io.InvalidClassException: scala.Tuple2; invalid descriptor for field _1
    at java.io.ObjectStreamClass.readNonProxy(ObjectStreamClass.java:697)
    at java.io.ObjectInputStream.readClassDescriptor(ObjectInputStream.java:827)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1583)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1514)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1750)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
    at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:125)
    at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:30)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
    at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:87)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$3.apply(PairRDDFunctions.scala:101)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKey$3.apply(PairRDDFunctions.scala:100)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:582)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:582)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:111)
    at org.apache.spark.scheduler.Task.run(Task.scala:51)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.IllegalArgumentException: illegal signature
    at java.io.ObjectStreamField.<init>(ObjectStreamField.java:119)
    at java.io.ObjectStreamClass.readNonProxy(ObjectStreamClass.java:695)
    ... 26 more

Anyone know why this is happening?

Thanks,
Suman.
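
A quick way to narrow down such a failure is to round-trip a Tuple2 through
plain JDK serialization on the affected JVM, outside Spark entirely (a minimal
sketch; if this alone throws InvalidClassException, the problem lies with the
JVM/Scala library combination rather than with Spark):

    import java.io._
    // Serialize a plain scala.Tuple2 to an in-memory buffer...
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(("key", 1))
    out.close()
    // ...and read it back with the same JVM's deserializer.
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    println(in.readObject())   // expect ("key",1)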




Re: Invalid Class Exception

2014-06-04 Thread Suman Somasundar


I tried building with Java 6 and also tried the pre-built packages. I am
still getting the same error.

It works fine when I run it on a machine with Solaris OS and x86
architecture, but it does not work with Solaris OS and SPARC architecture.

Any ideas why this would happen?

Thanks,
Suman.

On 6/4/2014 10:48 AM, Suman Somasundar wrote:
I am building Spark myself, using Java 7 to both build and run.

I will try with Java 6.

Thanks,
Suman.

On 6/3/2014 7:18 PM, Matei Zaharia wrote:
What Java version do you have, and how did you get Spark (did you build it
yourself by any chance or download a pre-built one)? If you build Spark
yourself you need to do it with Java 6 — it’s a known issue because of the
way Java 6 and 7 package JAR files. But I haven’t seen it result in this
particular error.

Matei

On Jun 3, 2014, at 5:18 PM, Suman Somasundar
suman.somasun...@oracle.com wrote:



Hi all,

I get the following exception when using Spark to run the example k-means
program. I am using Spark 1.0.0 and running the program locally.

java.io.InvalidClassException: scala.Tuple2; invalid descriptor for field _1
    ...
Caused by: java.lang.IllegalArgumentException: illegal signature
    ... 26 more

Anyone know why this is happening?

Thanks,
Suman.






Re: Invalid Class Exception

2014-06-03 Thread Matei Zaharia
What Java version do you have, and how did you get Spark (did you build it 
yourself by any chance or download a pre-built one)? If you build Spark 
yourself you need to do it with Java 6 — it’s a known issue because of the way 
Java 6 and 7 package JAR files. But I haven’t seen it result in this particular 
error.

Matei
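
One way to check which compiler produced the Spark classes a node actually
loads is to inspect the class-file major version (50 = Java 6, 51 = Java 7).
A minimal sketch for spark-shell, assuming the class is on the classpath;
note this checks the compiler target, not the jar packaging itself:

    import java.io.DataInputStream
    // Open the raw .class resource for a Spark class on this node's classpath.
    val res = getClass.getResourceAsStream("/org/apache/spark/SerializableWritable.class")
    val in = new DataInputStream(res)
    in.skipBytes(6)   // 4-byte magic plus 2-byte minor version
    println("class file major version: " + in.readUnsignedShort())
    in.close()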

On Jun 3, 2014, at 5:18 PM, Suman Somasundar suman.somasun...@oracle.com 
wrote:

 
 Hi all,
 
 I get the following exception when using Spark to run the example k-means
 program. I am using Spark 1.0.0 and running the program locally.
 
 java.io.InvalidClassException: scala.Tuple2; invalid descriptor for field _1
     ...
 Caused by: java.lang.IllegalArgumentException: illegal signature
     ... 26 more
 
 Anyone know why this is happening?
 
 Thanks,
 Suman.



Re: Invalid Class Exception

2014-05-28 Thread Suman Somasundar


On 5/27/2014 1:28 PM, Marcelo Vanzin wrote:

On Tue, May 27, 2014 at 1:05 PM, Suman Somasundar
suman.somasun...@oracle.com wrote:

I am running this on a Solaris machine with logical partitions. All the
partitions (workers) access the same Spark folder.

Can you check whether you have multiple versions of the offending class
(org.apache.spark.SerializableWritable) in the classpath of your apps?
Maybe you do and different nodes are loading jars in different order.


I checked all the org.apache.spark.SerializableWritable classes, and all of
them have the same serialVersionUID.
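
One way such a check can be done programmatically on each node, e.g. from
spark-shell, is via java.io.ObjectStreamClass (a minimal sketch; lookup
returns null for non-serializable classes):

    import java.io.ObjectStreamClass
    val cls = Class.forName("org.apache.spark.SerializableWritable")
    // Prints the serialVersionUID the local class resolves to on this node.
    println(ObjectStreamClass.lookup(cls).getSerialVersionUID)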


On 5/23/2014 9:44 PM, Andrew Or wrote:

That means not all of your driver and executors have the same version of
Spark. Are you on a standalone EC2 cluster? If so, one way to fix this is to
run the following on the master node:

/root/spark-ec2/copy-dir --delete /root/spark

This syncs all of Spark across your cluster, configs, jars and everything.


2014-05-23 15:20 GMT-07:00 Suman Somasundar suman.somasun...@oracle.com:

Hi,

I get the following exception when using Spark to run various programs.

java.io.InvalidClassException: org.apache.spark.SerializableWritable; local class incompatible: stream classdesc serialVersionUID = 6301214776158303468, local class serialVersionUID = -7785455416944904980
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:604)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1601)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1514)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1750)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
    at org.apache.spark.broadcast.HttpBroadcast$.read(HttpBroadcast.scala:165)
    at org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:56)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1866)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1866)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1964)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1888)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
 

Re: Invalid Class Exception

2014-05-27 Thread Suman Somasundar


I am running this on a Solaris machine with logical partitions. All the 
partitions (workers) access the same Spark folder.


Thanks,
Suman.

On 5/23/2014 9:44 PM, Andrew Or wrote:
That means not all of your driver and executors have the same version 
of Spark. Are you on a standalone EC2 cluster? If so, one way to fix 
this is to run the following on the master node:


/root/spark-ec2/copy-dir --delete /root/spark

This syncs all of Spark across your cluster, configs, jars and everything.


2014-05-23 15:20 GMT-07:00 Suman Somasundar
suman.somasun...@oracle.com:


Hi,

    I get the following exception when using Spark to run various programs.

    java.io.InvalidClassException: org.apache.spark.SerializableWritable; local class incompatible: stream classdesc serialVersionUID = 6301214776158303468, local class serialVersionUID = -7785455416944904980
        ...

Re: Invalid Class Exception

2014-05-27 Thread Marcelo Vanzin
On Tue, May 27, 2014 at 1:05 PM, Suman Somasundar
suman.somasun...@oracle.com wrote:
 I am running this on a Solaris machine with logical partitions. All the
 partitions (workers) access the same Spark folder.

Can you check whether you have multiple versions of the offending
class (org.apache.spark.SerializableWritable) in the classpath of your
apps? Maybe you do and different nodes are loading jars in different
order.
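
A related check is to find out which jar each node actually loads the class
from (a minimal sketch to run on each node, e.g. in spark-shell; getCodeSource
may be null for classes from the bootstrap classpath):

    val cls = Class.forName("org.apache.spark.SerializableWritable")
    // The URL of the jar (or directory) this class was loaded from.
    val src = Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation)
    println(src.getOrElse("unknown code source"))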

 On 5/23/2014 9:44 PM, Andrew Or wrote:

 That means not all of your driver and executors have the same version of
 Spark. Are you on a standalone EC2 cluster? If so, one way to fix this is to
 run the following on the master node:

 /root/spark-ec2/copy-dir --delete /root/spark

 This syncs all of Spark across your cluster, configs, jars and everything.


 2014-05-23 15:20 GMT-07:00 Suman Somasundar suman.somasun...@oracle.com:

 Hi,

 I get the following exception when using Spark to run various programs.

 java.io.InvalidClassException: org.apache.spark.SerializableWritable; local class incompatible: stream classdesc serialVersionUID = 6301214776158303468, local class serialVersionUID = -7785455416944904980
     ...