Hi All,
I am able to run my simple Spark job that reads from and writes to S3 locally, but when I move it to a cluster I get the cast exception below. The Spark environment is 2.0.1.
Please help out if anyone has already faced this kind of issue.
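
For reference, the job is essentially of this shape (a minimal sketch only; the bucket names and paths below are placeholders, not the actual ones):

    // Minimal sketch of the kind of job described above; bucket names
    // and paths are placeholders, not the real ones.
    import org.apache.spark.sql.SparkSession

    object S3ReadWrite {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("S3ReadWrite")
          .getOrCreate()

        // Read text data from S3 (placeholder input bucket/path).
        val df = spark.read.text("s3a://input-bucket/input/")

        // Write it back out to S3 (placeholder output bucket/path).
        df.write.text("s3a://output-bucket/output/")

        spark.stop()
      }
    }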



02/18 10:35:23 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, ip-172-31-45-63.ec2.internal): java.io.IOException: java.lang.ClassCastException: cannot assign instance of scala.Some to field org.apache.spark.util.AccumulatorMetadata.name of type scala.Option in instance of org.apache.spark.util.AccumulatorMetadata
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1283)
        at org.apache.spark.util.AccumulatorV2.readObject(AccumulatorV2.scala:171)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2122)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: cannot assign instance of scala.Some to field org.apache.spark.util.AccumulatorMetadata.name of type scala.Option in instance of org.apache.spark.util.AccumulatorMetadata
        at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
        at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2237)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)



