[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-12 Thread Oskar Blom (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15003634#comment-15003634
 ] 

Oskar Blom commented on SPARK-6152:
---

Nice - when is 1.6.0 due for release?

> Spark does not support Java 8 compiled Scala classes
> 
>
> Key: SPARK-6152
> URL: https://issues.apache.org/jira/browse/SPARK-6152
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 1.2.1
> Environment: Java 8+
> Scala 2.11
>Reporter: Ronald Chen
>Assignee: Josh Rosen
>Priority: Critical
> Fix For: 1.6.0
>
>
> Spark uses reflectasm to check Scala closures, which fails if the *user 
> defined Scala closures* are compiled to the Java 8 class file version.
> The cause is that reflectasm does not support Java 8:
> https://github.com/EsotericSoftware/reflectasm/issues/35
> Workaround:
> Don't compile Scala classes to Java 8; Scala 2.11 does not support nor 
> require any Java 8 features.
> Stack trace:
> {code}
> java.lang.IllegalArgumentException
>   at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>   at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>   at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
>   at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$getClassReader(ClosureCleaner.scala:41)
>   at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:84)
>   at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107)
>   at org.apache.spark.SparkContext.clean(SparkContext.scala:1478)
>   at org.apache.spark.rdd.RDD.map(RDD.scala:288)
>   at ...my Scala 2.11 compiled to Java 8 code calling into spark
> {code}
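The failure above comes down to the class-file major version: Java 7 javac/scalac output carries major version 51, Java 8 output carries 52, and ASM releases before 5.x reject anything above 51, so reflectasm's shaded ASM 4 throws IllegalArgumentException as soon as it reads a Java 8 closure class. A minimal standalone sketch of that version check (the class name and constant here are invented for illustration; this is not Spark or reflectasm code):

```java
public class ClassVersionCheck {
    // Highest class-file major version ASM 4.x accepts (Java 7 = 51).
    // Assumed constant for illustration.
    static final int MAX_SUPPORTED_MAJOR = 51;

    static int majorVersion(byte[] classFile) {
        // Class-file header layout: u4 magic (0xCAFEBABE), u2 minor_version,
        // u2 major_version (big-endian at offsets 6-7).
        if (classFile.length < 8
                || classFile[0] != (byte) 0xCA || classFile[1] != (byte) 0xFE
                || classFile[2] != (byte) 0xBA || classFile[3] != (byte) 0xBE) {
            throw new IllegalArgumentException("not a class file");
        }
        return ((classFile[6] & 0xFF) << 8) | (classFile[7] & 0xFF);
    }

    static boolean rejectedByAsm4(byte[] classFile) {
        return majorVersion(classFile) > MAX_SUPPORTED_MAJOR;
    }

    public static void main(String[] args) {
        byte[] java7Header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 51};
        byte[] java8Header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 52};
        System.out.println(rejectedByAsm4(java7Header)); // false: Java 7 bytecode is accepted
        System.out.println(rejectedByAsm4(java8Header)); // true: Java 8 bytecode triggers the failure
    }
}
```

Compiling the same closures with a Java 7 target keeps the major version at 51, which is why the workaround in the description avoids the crash.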



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-10 Thread Josh Rosen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14999190#comment-14999190
 ] 

Josh Rosen commented on SPARK-6152:
---

Does anyone have a standalone reproduction of this issue that I can use to test 
my PR? https://github.com/apache/spark/pull/9512




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-10 Thread Josh Rosen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14999222#comment-14999222
 ] 

Josh Rosen commented on SPARK-6152:
---

Yep, was able to reproduce trivially by running Spark's existing Scala unit 
tests with JDK 8. I'm going to add some plumbing to the build in order to let 
us test this in Jenkins.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-05 Thread Josh Rosen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14992660#comment-14992660
 ] 

Josh Rosen commented on SPARK-6152:
---

It turns out that Apache Geronimo has already published shaded ASM 5 artifacts:

http://mvnrepository.com/artifact/org.apache.xbean/xbean-asm5-shaded/4.4

Here's the source that was used to produce that shaded artifact:

https://github.com/apache/geronimo-xbean/tree/xbean-4.4/xbean-asm5-shaded

This corresponds to ASM 5.0.4, which is the latest release:

https://github.com/apache/geronimo-xbean/blob/xbean-4.4/pom.xml#L67

I'll investigate changing Spark to use this instead of reflectASM's shaded copy.
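For reference, pulling that artifact in directly would look something like this in a Maven POM (a sketch using the coordinates linked above; this is not Spark's actual build change):

```xml
<!-- Shaded ASM 5 from Apache Geronimo XBean; ASM 5 reads
     Java 8 (major version 52) class files. -->
<dependency>
  <groupId>org.apache.xbean</groupId>
  <artifactId>xbean-asm5-shaded</artifactId>
  <version>4.4</version>
</dependency>
```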




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-05 Thread Josh Rosen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14992639#comment-14992639
 ] 

Josh Rosen commented on SPARK-6152:
---

Do we actually need reflectasm itself for the closure cleaner or are we just 
using it as a convenient way to pull in a shaded ASM artifact? Why not publish 
our own shaded ASM 5 and use that instead?




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-11-05 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14992725#comment-14992725
 ] 

Apache Spark commented on SPARK-6152:
-

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/9512




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-09-15 Thread Steve Loughran (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14745278#comment-14745278
 ] 

Steve Loughran commented on SPARK-6152:
---

What changes? Yes, making sure there are no regressions. Hive had to upgrade to 
deal with bugs in Kryo 2.21; for Spark 1.5 there's a special 
org.spark-project.hive artifact which downgraded to Kryo 2.21, and the subset 
of Hive used by spark-hive and spark-hive-thriftserver works there. Hive would 
certainly veto any reverting to 2.21; I don't know what their stance would be 
on a 3.x upgrade on the 1.2 branch ... reluctant would be the default response, 
I suspect.

Getting everything to 2.24 is more likely, though if 3.x is needed for Java 8 
compatibility, a case could be made for it.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-08-14 Thread Malcolm Greaves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14697716#comment-14697716
 ] 

Malcolm Greaves commented on SPARK-6152:


Interesting [~ste...@apache.org]! What kinds of changes do you think this would 
require -- mostly verifying that there's backward compatibility with those 
serialized classes?




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-07-29 Thread Steve Loughran (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14646699#comment-14646699
 ] 

Steve Loughran commented on SPARK-6152:
---

Chill and Kryo need to be in sync; there's also the need to be compatible with 
the version Hive uses (which has historically been addressed with custom 
versions of Hive).

If Spark could jump to Kryo 3.x, the classpath conflict with Hive would go 
away, provided the wire formats of serialized classes were compatible: Hive's 
spark-client JAR uses Kryo 2.2.x to talk to Spark.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-06-05 Thread Malcolm Greaves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14575540#comment-14575540
 ] 

Malcolm Greaves commented on SPARK-6152:


It appears as if progress on updating chill to use a version of reflectasm that 
is compatible with Java 1.8 has stalled: 
https://github.com/twitter/chill/pull/224





[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-24 Thread Martin Grotzke (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14378918#comment-14378918
 ] 

Martin Grotzke commented on SPARK-6152:
---

Btw, we just released Kryo 3.0.1: 
https://github.com/EsotericSoftware/kryo/blob/master/CHANGES.md#2240---300-2014-0-04




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Martin Grotzke (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371252#comment-14371252
 ] 

Martin Grotzke commented on SPARK-6152:
---

I just released reflectasm-1.10.1 (which should now support Java 8 thanks to 
the upgrade to ASM 5) to Maven Central.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371254#comment-14371254
 ] 

Sean Owen commented on SPARK-6152:
--

Nice one. It looks like {{reflectasm}} comes in via {{chill}}. Do you know 
if/when {{chill}} might consume the newer version? Then we could consume that.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Ronald Chen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371608#comment-14371608
 ] 

Ronald Chen commented on SPARK-6152:


If you want to get the new version via {{chill}}, you will also need to wait 
for a release of {{kryo}}, as {{chill}} gets the dependency via {{kryo}}.

I think the root cause here is that we are using a shaded library from a third 
party dependency, which sounds like bad practice. We should just depend on an 
unshaded reflectasm directly.

{{kryo}} just updated to reflectasm-1.10.1 four hours ago, so this dependency 
chain will take a while to arrive.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Martin Grotzke (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371648#comment-14371648
 ] 

Martin Grotzke commented on SPARK-6152:
---

I'll try to get out a new kryo version as soon as possible...




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Martin Grotzke (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14371658#comment-14371658
 ] 

Martin Grotzke commented on SPARK-6152:
---

Btw, the chill guys are still on kryo 2.21, while kryo is currently at 3.0.0. 
IIRC there were some compatibility issues in kryo that they complained about. 
Perhaps you should already open an issue for chill Java 8 support to see what 
they think about it.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-20 Thread Ronald Chen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14372048#comment-14372048
 ] 

Ronald Chen commented on SPARK-6152:


Done: https://github.com/twitter/chill/issues/223




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-18 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14368242#comment-14368242
 ] 

Sean Owen commented on SPARK-6152:
--

To your deleted comment -- yes, indeed it looks like the library explicitly 
does not work with Java 8, since class file version 52 == Java 8.




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-18 Thread Jonathan Neufeld (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14368138#comment-14368138
 ] 

Jonathan Neufeld commented on SPARK-6152:
-

The exception is raised in 
com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader

It looks like it's this line:
if (this.readShort(6) > 51) {
    throw new IllegalArgumentException();
}

which bears the mark of a class file version compatibility problem to me.
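That check can be illustrated with a small sketch (a hypothetical helper, not part of Spark or reflectasm): a class file stores its major version in bytes 6-7 of the header, major version 52 means Java 8, and the shaded ASM here rejects anything above 51 (Java 7).

```scala
import java.io.{ByteArrayInputStream, DataInputStream}

// Hypothetical helper: read the major version from class file bytes.
// Header layout: magic 0xCAFEBABE (4 bytes), minor version (2), major version (2).
def classFileMajorVersion(bytes: Array[Byte]): Int = {
  val in = new DataInputStream(new ByteArrayInputStream(bytes))
  require(in.readInt() == 0xCAFEBABE, "not a class file")
  in.readUnsignedShort() // skip minor version
  in.readUnsignedShort() // major version: 51 = Java 7, 52 = Java 8
}
```

A closure class compiled for Java 8 yields 52 here, which trips the `> 51` guard above.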




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-04 Thread Ronald Chen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14347220#comment-14347220
 ] 

Ronald Chen commented on SPARK-6152:


I've made the description clearer.  The problem occurs when I create a 
Scala 2.11 project that imports Spark 1.2.1 and compile my Scala code to Java 
8 using scalac -target:jvm-1.8 ...
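For reference, a minimal sbt setup along those lines would be a sketch like the following (the exact Scala patch version is an assumption; the dependency coordinates and compiler flag are the ones named in this issue):

```scala
// build.sbt -- hypothetical minimal project reproducing the failure
scalaVersion := "2.11.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

// Triggers the bug: emits Java 8 (class file version 52) bytecode
scalacOptions += "-target:jvm-1.8"

// Workaround: target Java 7 bytecode instead
// scalacOptions += "-target:jvm-1.7"
```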




[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

2015-03-04 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14346650#comment-14346650
 ] 

Sean Owen commented on SPARK-6152:
--

Spark is not compiled with Java 8. What problem are you reporting?
