[ https://issues.apache.org/jira/browse/SPARK-25047?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-25047:
------------------------------
    Docs Text: 
Release Notes text:

In Scala 2.12, in some rare cases, Spark jobs will fail with an error like
"java.lang.ClassCastException: cannot assign instance of
java.lang.invoke.SerializedLambda ... of type scala.Function1". This typically
occurs when a closure (a function value) is assigned to a class member. It can
usually be avoided by changing the member from a function-typed "val" to a
plain method definition, i.e. a "def".

> Can't assign SerializedLambda to scala.Function1 in deserialization of 
> BucketedRandomProjectionLSHModel
> -------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25047
>                 URL: https://issues.apache.org/jira/browse/SPARK-25047
>             Project: Spark
>          Issue Type: Sub-task
>          Components: ML
>    Affects Versions: 2.4.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Major
>             Fix For: 2.4.0
>
>
> Another distinct test failure:
> {code:java}
> - BucketedRandomProjectionLSH: streaming transform *** FAILED ***
>   org.apache.spark.sql.streaming.StreamingQueryException: Query [id = 
> 7f34fb07-a718-4488-b644-d27cfd29ff6c, runId = 
> 0bbc0ba2-2952-4504-85d6-8aba877ba01b] terminated with exception: Job aborted 
> due to stage failure: Task 0 in stage 16.0 failed 1 times, most recent 
> failure: Lost task 0.0 in stage 16.0 (TID 16, localhost, executor driver): 
> java.lang.ClassCastException: cannot assign instance of 
> java.lang.invoke.SerializedLambda to field 
> org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.hashFunction of 
> type scala.Function1 in instance of 
> org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel
> ...
>   Cause: java.lang.ClassCastException: cannot assign instance of 
> java.lang.invoke.SerializedLambda to field 
> org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel.hashFunction of 
> type scala.Function1 in instance of 
> org.apache.spark.ml.feature.BucketedRandomProjectionLSHModel
>   at 
> java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
>   at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2284)
> ...{code}
> Here the different nature of a Java 8 LambdaMetafactory (LMF) closure trips up
> Java serialization/deserialization. I think this can be patched by manually
> implementing the Java serialization here, and I don't see other instances (yet).
> I'm also wondering whether this "val" can be a "def". A rough sketch of the
> custom-serialization idea follows.
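> A minimal sketch of that idea, using a hypothetical toy class rather than the
> actual BucketedRandomProjectionLSHModel: mark the function-typed field
> @transient so default Java serialization skips it, and rebuild it from the
> plain model data after deserialization.
> {code:scala}
> import java.io.ObjectInputStream
>
> // Hypothetical stand-in for an LSH-style model, not the real Spark class.
> class ToyLSHModel(val randUnitVectors: Array[Array[Double]], val bucketLength: Double)
>   extends Serializable {
>
>   // Transient, so defaultWriteObject never tries to serialize the lambda itself.
>   @transient private var hashFunction: Array[Double] => Array[Double] = buildHashFunction()
>
>   private def buildHashFunction(): Array[Double] => Array[Double] =
>     (elems: Array[Double]) => randUnitVectors.map { v =>
>       math.floor(v.zip(elems).map { case (a, b) => a * b }.sum / bucketLength)
>     }
>
>   def hash(elems: Array[Double]): Array[Double] = hashFunction(elems)
>
>   // Rebuild the function after default deserialization has restored the
>   // (serializable) randUnitVectors and bucketLength fields.
>   private def readObject(in: ObjectInputStream): Unit = {
>     in.defaultReadObject()
>     hashFunction = buildHashFunction()
>   }
> }
> {code}
> This keeps the public behavior unchanged while ensuring no SerializedLambda is
> ever written; turning the val into a def would sidestep the field entirely.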


