[ https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14646699#comment-14646699 ]

Steve Loughran commented on SPARK-6152:
---------------------------------------

Chill and Kryo need to be in sync; there's also the need to be compatible with 
the version Hive uses (which has historically been addressed with custom 
builds of Hive).

If Spark could jump to Kryo 3.x, the classpath conflict with Hive would go 
away, provided the wire formats of the serialized classes were compatible: 
Hive's spark-client JAR uses Kryo 2.2.x to talk to Spark.
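
To make the coupling concrete, here is a minimal sbt sketch of the kind of 
pinning this implies (the coordinates are real, but the version pair shown is 
an illustrative assumption, not a tested combination):

{code}
// build.sbt -- force one consistent Kryo/Chill pair across the transitive
// dependency graph, so Spark, Chill and Hive all resolve the same Kryo.
dependencyOverrides ++= Set(
  "com.esotericsoftware.kryo" % "kryo"   % "2.21",
  "com.twitter"               %% "chill" % "0.5.0"
)
{code}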

> Spark does not support Java 8 compiled Scala classes
> ----------------------------------------------------
>
>                 Key: SPARK-6152
>                 URL: https://issues.apache.org/jira/browse/SPARK-6152
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Java 8+
> Scala 2.11
>            Reporter: Ronald Chen
>            Priority: Minor
>
> Spark uses reflectasm to check Scala closures, which fails if the *user 
> defined Scala closures* are compiled to the Java 8 class file version.
> The cause is that reflectasm does not support Java 8:
> https://github.com/EsotericSoftware/reflectasm/issues/35
> Workaround:
> Don't compile Scala classes to the Java 8 class file version; Scala 2.11 
> neither supports nor requires any Java 8 features (see the sketch after the 
> stack trace).
> Stack trace:
> {code}
> java.lang.IllegalArgumentException
>       at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>       at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>       at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>       at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$getClassReader(ClosureCleaner.scala:41)
>       at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:84)
>       at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107)
>       at org.apache.spark.SparkContext.clean(SparkContext.scala:1478)
>       at org.apache.spark.rdd.RDD.map(RDD.scala:288)
>       at ...my Scala 2.11 compiled to Java 8 code calling into spark
> {code}
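
A sketch of the workaround above, assuming an sbt build (-target is a 
standard scalac option in Scala 2.11; the jvm-1.7 choice is illustrative, any 
pre-Java-8 target avoids the unsupported class file version):

{code}
// build.sbt -- emit class files at an older class file version so the
// shaded ASM inside reflectasm can still parse the user-defined closures.
scalacOptions += "-target:jvm-1.7"

// If Java sources are mixed into the build, pin javac to match.
javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
{code}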


