sampleMap is populated from inside a method that is called from
updateStateByKey.
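If that method mutates the same SampleSession instance that Spark has already cached as state, the map can change underneath Kryo while it is being serialized, which would be one explanation for a ClassCastException like this. A minimal sketch of a copy-on-update alternative, where each update returns a fresh session with its own map instead of mutating the old one (class shapes follow the thread; the withSample helper is hypothetical):

```java
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// Hypothetical minimal versions of the classes from the thread.
class Sample implements Serializable {}

class SampleSession implements Serializable, Cloneable {
    private Map<String, Sample> sampleMap = new HashMap<>();

    Map<String, Sample> getSampleMap() { return sampleMap; }

    // Return a new session holding a copy of the map, instead of
    // mutating this instance in place. Any previously cached or
    // in-flight serialized state is then never touched by later batches.
    SampleSession withSample(String key, Sample value) {
        SampleSession next = new SampleSession();
        next.sampleMap = new HashMap<>(this.sampleMap);
        next.sampleMap.put(key, value);
        return next;
    }
}

public class CopyOnUpdateDemo {
    public static void main(String[] args) {
        SampleSession previous = new SampleSession();
        SampleSession updated = previous.withSample("s1", new Sample());
        // The old state is untouched; only the new state sees the entry.
        System.out.println(previous.getSampleMap().size()); // 0
        System.out.println(updated.getSampleMap().size());  // 1
    }
}
```

Inside updateStateByKey, the update function would then return `state.withSample(...)` rather than calling `put` on the existing state's map.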

On Thu, Jun 23, 2016 at 1:13 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Can you illustrate how sampleMap is populated?
>
> Thanks
>
> On Thu, Jun 23, 2016 at 12:34 PM, SRK <swethakasire...@gmail.com> wrote:
>
>> Hi,
>>
>> I keep getting the following error in my Spark Streaming job every now and
>> then, after the job has run for around 10 hours. I have the 2 classes
>> registered with Kryo as shown below, and sampleMap is a field in
>> SampleSession. Any suggestion as to how to avoid this would be of great
>> help!
>>
>> public class SampleSession implements Serializable, Cloneable{
>>    private Map<String, Sample> sampleMap;
>> }
>>
>> sparkConf.registerKryoClasses(Array(classOf[SampleSession], classOf[Sample]))
>>
>>
>>
>> com.esotericsoftware.kryo.KryoException: java.lang.ClassCastException: com.test.Sample cannot be cast to java.lang.String
>> Serialization trace:
>> sampleMap (com.test.SampleSession)
>>         at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:585)
>>         at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
>>         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
>>         at com.twitter.chill.Tuple6Serializer.write(TupleSerializers.scala:96)
>>         at com.twitter.chill.Tuple6Serializer.write(TupleSerializers.scala:93)
>>         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
>>         at com.twitter.chill.Tuple2Serializer.write(TupleSerializers.scala:37)
>>         at com.twitter.chill.Tuple2Serializer.write(TupleSerializers.scala:33)
>>         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
>>         at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:158)
>>         at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:153)
>>         at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1190)
>>         at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1199)
>>         at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:132)
>>         at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:793)
>>         at org.apache.spark.storage.BlockManager.putArray(BlockManager.scala:669)
>>         at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:175)
>>         at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
>>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:262)
>>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
>>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
>>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
>>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
>>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
>>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
>>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:300)
>>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
>>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
>>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
>>         at org.apache.spark.scheduler.Task.run(Task.scala:88)
>>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>         at java.lang.Thread.run(Thread.java:745)
>>
>> Caused by: java.lang.ClassCastException: com.test.Sample cannot be cast to java.lang.String
>>         at com.esotericsoftware.kryo.serializers.DefaultSerializers$StringSerializer.write(DefaultSerializers.java:146)
>>         at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
>>         at com.esotericsoftware.kryo.serializers.MapSerializer.write(MapSerializer.java:82)
>>         at com.esotericsoftware.kryo.serializers.MapSerializer.write(MapSerializer.java:17)
>>         at com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:501)
>>         at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:564)
>>         ... 37 more
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-ClassCastException-during-Serialization-deserialization-in-Spark-Streaming-tp27219.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>
