[ https://issues.apache.org/jira/browse/SPARK-36627?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean R. Owen resolved SPARK-36627.
----------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 33879
[https://github.com/apache/spark/pull/33879]

> Tasks with Java proxy objects fail to deserialize
> -------------------------------------------------
>
>                 Key: SPARK-36627
>                 URL: https://issues.apache.org/jira/browse/SPARK-36627
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.3
>            Reporter: Samuel Souza
>            Assignee: Samuel Souza
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> In JavaSerializer.JavaDeserializationStream we override resolveClass of
> ObjectInputStream to use the thread's contextClassLoader. However, we do not
> override resolveProxyClass, which is used when deserializing Java proxy
> objects. As a result, Spark uses the wrong classloader when deserializing
> proxy objects, and the job fails with the following exception:
> {code}
> Caused by: org.apache.spark.SparkException: Job aborted due to stage failure:
> Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in
> stage 1.0 (TID 4, <host>, executor 1): java.lang.ClassNotFoundException: <class>
> 	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
> 	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> 	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
> 	at java.base/java.lang.Class.forName0(Native Method)
> 	at java.base/java.lang.Class.forName(Class.java:398)
> 	at java.base/java.io.ObjectInputStream.resolveProxyClass(ObjectInputStream.java:829)
> 	at java.base/java.io.ObjectInputStream.readProxyDesc(ObjectInputStream.java:1917)
> ...
> 	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
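[Editor's note] The description explains that overriding resolveClass alone is not enough: java.io.ObjectInputStream has a separate hook, resolveProxyClass, that is consulted when a serialized dynamic proxy is read back, and it defaults to a loader that may not see user classes. Below is a minimal standalone sketch of the idea behind the fix, not Spark's actual code; the class names ContextAwareObjectInputStream, NoopHandler, and ProxyRoundTrip are hypothetical, chosen here for illustration.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.ObjectStreamClass;
import java.io.Serializable;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// An ObjectInputStream that resolves both plain classes and proxy
// interfaces through the thread's context class loader.
class ContextAwareObjectInputStream extends ObjectInputStream {
    ContextAwareObjectInputStream(InputStream in) throws IOException {
        super(in);
    }

    private ClassLoader loader() {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        return cl != null ? cl : getClass().getClassLoader();
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws ClassNotFoundException {
        // The override JavaDeserializationStream already had.
        return Class.forName(desc.getName(), false, loader());
    }

    @Override
    protected Class<?> resolveProxyClass(String[] interfaces)
            throws ClassNotFoundException {
        // The missing piece: look up each proxy interface in the context
        // class loader, then build the dynamic proxy class from that
        // loader instead of the default one.
        ClassLoader cl = loader();
        Class<?>[] ifaces = new Class<?>[interfaces.length];
        for (int i = 0; i < interfaces.length; i++) {
            ifaces[i] = Class.forName(interfaces[i], false, cl);
        }
        return Proxy.getProxyClass(cl, ifaces);
    }
}

// A serializable invocation handler, so a dynamic proxy instance can
// round-trip through Java serialization in the demo below.
class NoopHandler implements InvocationHandler, Serializable {
    public Object invoke(Object proxy, Method m, Object[] args) {
        return null;
    }
}

public class ProxyRoundTrip {
    public static void main(String[] args) throws Exception {
        Object proxy = Proxy.newProxyInstance(
                Thread.currentThread().getContextClassLoader(),
                new Class<?>[] { Runnable.class },
                new NoopHandler());

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(proxy);
        }

        // Deserializing triggers resolveProxyClass with the names of the
        // proxy's interfaces ("java.lang.Runnable" here).
        try (ObjectInputStream ois = new ContextAwareObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            Object back = ois.readObject();
            System.out.println(Proxy.isProxyClass(back.getClass())
                    && back instanceof Runnable); // prints "true"
        }
    }
}
```

With the default resolveProxyClass, the same round trip fails with ClassNotFoundException whenever the proxy's interfaces live only in a non-default class loader (e.g. classes shipped via --jars), which matches the stack trace above.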