[ https://issues.apache.org/jira/browse/SPARK-36869?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17422215#comment-17422215 ]

Dongjoon Hyun commented on SPARK-36869:
---------------------------------------

You can test Scala 2.12.15 with the Apache Spark 3.2.0 RC6 binaries:
- https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc6-bin/
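
A quick way to confirm which Scala patch version a given Spark distribution was built with is to query the Scala library properties from that distribution's bin/spark-shell (a minimal, illustrative check):

{code:scala}
// Run inside bin/spark-shell of the distribution under test.
// Prints the version of the scala-library on the classpath, e.g. "2.12.15".
scala.util.Properties.versionNumberString
{code}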

> Spark job fails due to java.io.InvalidClassException: scala.collection.mutable.WrappedArray$ofRef; local class incompatible
> ---------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-36869
>                 URL: https://issues.apache.org/jira/browse/SPARK-36869
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 3.1.2
>         Environment: * RHEL 8.4
>  * Java 11.0.12
>  * Spark 3.1.2 (only pre-built with Scala *2.12.10*)
>  * Scala *2.12.14* for the application code
>            Reporter: Hamid EL MAAZOUZ
>            Priority: Blocker
>              Labels: scala, serialization, spark
>
> This is a Scala problem. It has already been reported at 
> [https://github.com/scala/bug/issues/5046], and a fix has been merged at 
> [https://github.com/scala/scala/pull/9166].
> According to 
> [https://github.com/scala/bug/issues/5046#issuecomment-928108088], the *fix* 
> is available as of *Scala 2.12.14*, but *Spark 3.0+* is only pre-built with 
> Scala *2.12.10*.
>  
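> A quick way to see the skew is to ask java.io.ObjectStreamClass for the 
> serialVersionUID computed from whichever scala-library is on the classpath; 
> running the sketch below once against 2.12.10 and once against 2.12.14 
> should print the two conflicting values shown in the stack trace below:
> {code:scala}
> import java.io.ObjectStreamClass
> import scala.collection.mutable.WrappedArray
> 
> // Prints the serialVersionUID of scala.collection.mutable.WrappedArray$ofRef
> // as computed from the scala-library currently on the classpath.
> val uid = ObjectStreamClass.lookup(classOf[WrappedArray.ofRef[AnyRef]]).getSerialVersionUID
> println(s"WrappedArray$$ofRef serialVersionUID = $uid")
> {code}
>  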
>  * Stack trace of the failure (taken from the stderr of a worker process):
> {code:java}
> Spark Executor Command: "/usr/java/jdk-11.0.12/bin/java" "-cp" "/opt/apache/spark-3.1.2-bin-hadoop3.2/conf/:/opt/apache/spark-3.1.2-bin-hadoop3.2/jars/*" "-Xmx1024M" "-Dspark.driver.port=45887" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@192.168.0.191:45887" "--executor-id" "0" "--hostname" "192.168.0.191" "--cores" "12" "--app-id" "app-20210927231035-0000" "--worker-url" "spark://Worker@192.168.0.191:35261"
> ========================================
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 21/09/27 23:10:36 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 18957@localhost
> 21/09/27 23:10:36 INFO SignalUtils: Registering signal handler for TERM
> 21/09/27 23:10:36 INFO SignalUtils: Registering signal handler for HUP
> 21/09/27 23:10:36 INFO SignalUtils: Registering signal handler for INT
> 21/09/27 23:10:36 WARN Utils: Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 192.168.0.191 instead (on interface wlp82s0)
> 21/09/27 23:10:36 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
> WARNING: An illegal reflective access operation has occurred
> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/apache/spark-3.1.2-bin-hadoop3.2/jars/spark-unsafe_2.12-3.1.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
> WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
> WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
> WARNING: All illegal access operations will be denied in a future release
> 21/09/27 23:10:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 21/09/27 23:10:36 INFO SecurityManager: Changing view acls to: hamidelmaazouz
> 21/09/27 23:10:36 INFO SecurityManager: Changing modify acls to: hamidelmaazouz
> 21/09/27 23:10:36 INFO SecurityManager: Changing view acls groups to: 
> 21/09/27 23:10:36 INFO SecurityManager: Changing modify acls groups to: 
> 21/09/27 23:10:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hamidelmaazouz); groups with view permissions: Set(); users  with modify permissions: Set(hamidelmaazouz); groups with modify permissions: Set()
> 21/09/27 23:10:37 INFO TransportClientFactory: Successfully created connection to /192.168.0.191:45887 after 44 ms (0 ms spent in bootstraps)
> 21/09/27 23:10:37 WARN TransportChannelHandler: Exception in connection from /192.168.0.191:45887
> java.io.InvalidClassException: scala.collection.mutable.WrappedArray$ofRef; local class incompatible: stream classdesc serialVersionUID = 3456489343829468865, local class serialVersionUID = 1028182004549731694
>       at java.base/java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:689)
>       at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2012)
>       at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1862)
>       at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2169)
>       at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
>       at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
>       at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
>       at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
>       at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
>       at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:493)
>       at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:451)
>       at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
>       at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$2(NettyRpcEnv.scala:299)
>       at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:352)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$1(NettyRpcEnv.scala:298)
>       at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:298)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$7(NettyRpcEnv.scala:246)
>       at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$7$adapted(NettyRpcEnv.scala:246)
>       at org.apache.spark.rpc.netty.RpcOutboxMessage.onSuccess(Outbox.scala:90)
>       at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:195)
>       at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:142)
>       at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)
>       at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>       at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>       at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>       at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>       at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>       at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
>       at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
>       at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
>       at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
>       at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>       at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>       at java.base/java.lang.Thread.run(Thread.java:834)
> Exception in thread "main" java.lang.reflect.UndeclaredThrowableException
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1748)
>       at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:61)
>       at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:393)
>       at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:382)
>       at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
> Caused by: org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
>       at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
>       at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
>       at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
>       at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>       at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
>       at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103)
>       at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87)
>       at org.apache.spark.executor.CoarseGrainedExecutorBackend$.$anonfun$run$7(CoarseGrainedExecutorBackend.scala:421)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:62)
>       at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:61)
>       at java.base/java.security.AccessController.doPrivileged(Native Method)
>       at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>       ... 4 more
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
>       at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:259)
>       at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
>       at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:293)
>       at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
>       ... 12 more
> {code}
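>  
> Until a Spark release built against a patched Scala is available, a common 
> workaround is to compile the application with the same Scala patch version 
> the Spark distribution was pre-built with. A minimal sketch, assuming an 
> sbt build (adjust to your build tool):
> {code:scala}
> // build.sbt -- pin the application's Scala version to the one bundled with
> // the Spark 3.1.2 distribution, so the scala-library classes on the driver
> // and executors share the same serialVersionUIDs.
> scalaVersion := "2.12.10"
> 
> libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2" % "provided"
> {code}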


