[ https://issues.apache.org/jira/browse/SPARK-35496?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-35496:
----------------------------------
    Description: 
Scala 2.13.6 was released (https://github.com/scala/scala/releases/tag/v2.13.6). However, we skip 2.13.6 because it contains a breaking behavior change that differs from both Scala 2.13.5 and Scala 3.
- https://github.com/scala/bug/issues/12403
{code}
scala3-3.0.0:$ bin/scala
scala> Array.empty[Double].intersect(Array(0.0))
val res0: Array[Double] = Array()

scala-2.13.6:$ bin/scala
Welcome to Scala 2.13.6 (OpenJDK 64-Bit Server VM, Java 1.8.0_292).
Type in expressions for evaluation. Or try :help.

scala> Array.empty[Double].intersect(Array(0.0))
java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [D
  ... 32 elided
{code}

  was: Scala 2.13.6 released (https://github.com/scala/scala/releases/tag/v2.13.6)


> Upgrade Scala 2.13 to 2.13.7
> ----------------------------
>
>                 Key: SPARK-35496
>                 URL: https://issues.apache.org/jira/browse/SPARK-35496
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 3.2.0
>            Reporter: Yang Jie
>            Priority: Major
>
> Scala 2.13.6 was released (https://github.com/scala/scala/releases/tag/v2.13.6). However, we skip 2.13.6 because it contains a breaking behavior change that differs from both Scala 2.13.5 and Scala 3.
> - https://github.com/scala/bug/issues/12403
> {code}
> scala3-3.0.0:$ bin/scala
> scala> Array.empty[Double].intersect(Array(0.0))
> val res0: Array[Double] = Array()
>
> scala-2.13.6:$ bin/scala
> Welcome to Scala 2.13.6 (OpenJDK 64-Bit Server VM, Java 1.8.0_292).
> Type in expressions for evaluation. Or try :help.
>
> scala> Array.empty[Double].intersect(Array(0.0))
> java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [D
>   ... 32 elided
> {code}
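For projects that cannot wait for the 2.13.7 upgrade, a minimal sketch (not part of this ticket) of a defensive workaround is shown below. It simply short-circuits the empty-array case so the buggy {{Array.intersect}} path in Scala 2.13.6 is never exercised; the {{SafeIntersect}} object and {{safeIntersect}} helper names are hypothetical.

{code}
import scala.reflect.ClassTag

// Hypothetical workaround sketch for scala/bug#12403 on Scala 2.13.6:
// avoid calling Array.intersect when either side is empty, since the buggy
// path returns a shared empty Array[Object] that is later cast to a
// primitive array type (e.g. [D) and throws ClassCastException.
object SafeIntersect {
  def safeIntersect[T: ClassTag](a: Array[T], b: Array[T]): Array[T] =
    if (a.isEmpty || b.isEmpty) Array.empty[T]  // never hit the buggy path
    else a.intersect(b)

  def main(args: Array[String]): Unit = {
    val res = safeIntersect(Array.empty[Double], Array(0.0))
    // Prints "Array()" on 2.13.5, 2.13.6 and 2.13.7 alike, whereas the plain
    // Array.empty[Double].intersect(Array(0.0)) throws on 2.13.6.
    println(res.mkString("Array(", ", ", ")"))
  }
}
{code}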