[ https://issues.apache.org/jira/browse/SPARK-7483?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537762#comment-14537762 ]

Tomasz Bartczak commented on SPARK-7483:
----------------------------------------

Hmm, the class org.apache.spark.mllib.fpm.FPTree is private and lives in 
spark-mllib, while Spark registers its Kryo classes in spark-core in 
org.apache.spark.serializer.KryoSerializer#toRegister, so it is not 
straightforward to register it there.
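
For the record, a minimal sketch of what registering those classes from user 
code might look like, assuming one is willing to look the private classes up 
by name (the names are taken from the stack trace below, not from any public 
API):

{code}
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("fpgrowth-kryo")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

// FPTree and its inner Summary class are private to MLlib, so from user code
// they can only be referenced reflectively; these names come from the
// exception trace below.
conf.registerKryoClasses(Array(
  Class.forName("org.apache.spark.mllib.fpm.FPTree"),
  Class.forName("org.apache.spark.mllib.fpm.FPTree$Summary")
))
{code}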

And I do not think this is a matter of registration anyway; the Spark 
serialization docs say: "Finally, if you don’t register your custom classes, 
Kryo will still work, but it will have to store the full class name with each 
object, which is wasteful."

> [MLLib] Using Kryo with FPGrowth fails with an exception
> --------------------------------------------------------
>
>                 Key: SPARK-7483
>                 URL: https://issues.apache.org/jira/browse/SPARK-7483
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 1.3.1
>            Reporter: Tomasz Bartczak
>            Priority: Minor
>
> When using FPGrowth algorithm with KryoSerializer - Spark fails with
> {code}
> Job aborted due to stage failure: Task 0 in stage 9.0 failed 1 times, most 
> recent failure: Lost task 0.0 in stage 9.0 (TID 16, localhost): 
> com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: 
> Can not set final scala.collection.mutable.ListBuffer field 
> org.apache.spark.mllib.fpm.FPTree$Summary.nodes to 
> scala.collection.mutable.ArrayBuffer
> Serialization trace:
> nodes (org.apache.spark.mllib.fpm.FPTree$Summary)
> org$apache$spark$mllib$fpm$FPTree$$summaries 
> (org.apache.spark.mllib.fpm.FPTree)
> {code}
> This can be easily reproduced in the Spark codebase by setting 
> {code}
> conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> {code}
> and running FPGrowthSuite.
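
For completeness, a rough standalone sketch of such a reproduction outside the 
test suite (the data set, thresholds, and partition count here are made up for 
illustration; FPGrowthSuite remains the authoritative reproduction):

{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.fpm.FPGrowth

// Hypothetical standalone reproduction; FPGrowthSuite is the real one.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("fpgrowth-kryo-repro")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
val sc = new SparkContext(conf)

// Small made-up transaction data set; any input that makes FPGrowth build
// and shuffle FP-trees across partitions should exercise the same code path.
val transactions = sc.parallelize(Seq(
  Array("a", "b", "c"),
  Array("a", "b"),
  Array("b", "c"),
  Array("a", "c")
), 2)

val model = new FPGrowth()
  .setMinSupport(0.5)
  .setNumPartitions(2)
  .run(transactions)

// Collecting the frequent itemsets forces the FPTree objects through Kryo.
model.freqItemsets.collect().foreach { itemset =>
  println(itemset.items.mkString("[", ",", "]") + ": " + itemset.freq)
}

sc.stop()
{code}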


