After upgrading from Spark 0.7 to Spark 0.8 I can no longer access any
files on HDFS.
I see the error below. Any ideas?

I am running Spark standalone on a cluster that also has CDH 4.3.0, and
I rebuilt Spark accordingly. The jars in lib_managed look good to me.
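For reference, rebuilding Spark 0.8 against a specific Hadoop distribution is done by setting SPARK_HADOOP_VERSION before running the sbt assembly, roughly as sketched below (the exact CDH version string depends on whether the cluster runs MRv1 or YARN; the values shown are assumptions to adapt, not my actual invocation):

```shell
# Sketch of a Spark 0.8 build against CDH4, from the Spark source root.
# For a CDH4 cluster running MRv1:
SPARK_HADOOP_VERSION=2.0.0-mr1-cdh4.3.0 sbt/sbt assembly

# For a CDH4 cluster running YARN, SPARK_YARN must also be set:
# SPARK_HADOOP_VERSION=2.0.0-cdh4.3.0 SPARK_YARN=true sbt/sbt assembly
```

An EOFException while deserializing a FileSplit, as below, is a typical symptom when the executors end up with Hadoop client jars that do not match the cluster's version, so it may be worth double-checking that every worker picked up the newly built assembly.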

I noticed similar errors on the mailing list but found no suggested
solutions.

Thanks! Koert


13/10/17 17:43:23 ERROR Executor: Exception in task ID 0
java.io.EOFException
        at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2703)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1008)
        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:68)
        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:106)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:258)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:250)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:280)
        at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:75)
        at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1852)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1756)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1326)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1950)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1874)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1756)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1326)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
        at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:135)
        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1795)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1754)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1326)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:61)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:153)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)
