[ 
https://issues.apache.org/jira/browse/SPARK-19675?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15877255#comment-15877255
 ] 

Shixiong Zhu commented on SPARK-19675:
--------------------------------------

[~taroplus] Yeah, I should have checked `sbt run` myself. It looks like sbt 
does use some class loader magic to support multiple Scala versions.

However, I don't see how to support this in Spark. In general, the recommended 
way to run a Spark application is spark-submit rather than `sbt run`.

I checked local mode and it works there, so the issue happens when launching 
separate executor processes, i.e., in local-cluster, client, and cluster modes.

There are a couple of problems with supporting `sbt run`. E.g.,

- How to know if the driver is using `sbt run`?
- How to launch a new executor process with the SBT ClassLoader automatically 
on a remote node? The remote node may not have SBT installed.

This is really a low-priority feature. You're welcome to submit a PR if you 
have time to work on it.
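
For what it's worth, the serialVersionUID mismatch in the report is easy to 
confirm from either side: `java.io.ObjectStreamClass` exposes the UID that the 
local classpath computes for a class, so printing it on both the driver and an 
executor pinpoints the disagreement. A minimal sketch, using `java.lang.String` 
as a stand-in since scala-library may not be on the local classpath:

```java
import java.io.ObjectStreamClass;

public class SuidCheck {
    public static void main(String[] args) throws Exception {
        // Look up the serialVersionUID the *local* classpath computes for a
        // class; comparing this value across JVMs pinpoints mismatches.
        // String's UID is fixed by the serialization spec, so it is a
        // stable stand-in for this sketch.
        Class<?> c = Class.forName("java.lang.String");
        System.out.println(ObjectStreamClass.lookup(c).getSerialVersionUID());
        // On a Spark classpath, pass "scala.Option" instead and compare the
        // result against the two UIDs in the stack trace below.
    }
}
```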

> ExecutorClassLoader loads classes from SystemClassLoader
> --------------------------------------------------------
>
>                 Key: SPARK-19675
>                 URL: https://issues.apache.org/jira/browse/SPARK-19675
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0, 2.2.0
>         Environment: sbt / Play Framework
>            Reporter: Kohki Nishio
>            Priority: Minor
>
> The Spark executor loads classes from the SystemClassLoader, which contains 
> sbt-launch.jar bundling Scala 2.10 binaries; however, Spark itself is built 
> against Scala 2.11, so it throws an InvalidClassException:
> java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632
>       at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
>       at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
>       at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
>       at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
>       at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
>       at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> ExecutorClassLoader's desired class loader (parentLoader) actually contains 
> the correct path (scala-library-2.11.8.jar), but it is not being used.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
