Daniel Darabos created SPARK-5102:
-------------------------------------

             Summary: CompressedMapStatus needs to be registered with Kryo
                 Key: SPARK-5102
                 URL: https://issues.apache.org/jira/browse/SPARK-5102
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.2.0
            Reporter: Daniel Darabos
            Priority: Minor


After upgrading from Spark 1.1.0 to 1.2.0, I got this exception:

{code}
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.scheduler.CompressedMapStatus
Note: To register this class use: kryo.register(org.apache.spark.scheduler.CompressedMapStatus.class);
        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:442)
        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:79)
        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:472)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:565)
        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:165)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:206)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
{code}

To work around this, I had to register {{org.apache.spark.scheduler.CompressedMapStatus}} with Kryo myself. I think this registration should be done in {{spark/serializer/KryoSerializer.scala}}, unless instances of this class are never meant to be sent over the wire. (Or maybe I'm doing something wrong?)
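
For reference, this is roughly how the registration looks on the application side (a minimal sketch; {{MyKryoRegistrator}} is just an example name, and since {{CompressedMapStatus}} is package-private I go through {{Class.forName}} rather than a {{classOf}} literal):

{code}
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

// Example workaround: register CompressedMapStatus from a custom KryoRegistrator
// and point spark.kryo.registrator at this class.
class MyKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(Class.forName("org.apache.spark.scheduler.CompressedMapStatus"))
  }
}
{code}

(Kryo only throws this when registration is required, i.e. when running with {{spark.kryo.registrationRequired=true}}.)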


