Hello, I'm writing a Play Framework application that uses Spark; however, I'm getting the exception below:
java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)

This happens because I'm launching the application via SBT, and sbt-launch.jar contains the Scala 2.10 binary, while my Spark binary is built for Scala 2.11; the serialVersionUID mismatch on scala.Option comes from mixing the two. It's picking up the Option class from the system classloader, so I believe ExecutorClassLoader needs to override the loadClass method as well. Can anyone comment on this?

Thanks
-- Kohki Nishio
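For reference, here is a minimal sketch of the kind of loadClass override I have in mind: a child-first classloader that tries its own class path before delegating to the parent, so a class like scala.Option would resolve against the loader's own Scala version instead of whatever the launcher put on the system class path. This is only an illustration of the delegation order, not Spark's actual ExecutorClassLoader, and the class name here is hypothetical.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical child-first class loader. Unlike the default parent-first
// delegation, loadClass tries this loader's own definition before asking
// the parent, except for core java.* classes, which must always come from
// the bootstrap loader (defineClass forbids redefining them anyway).
public class ChildFirstClassLoader extends ClassLoader {

    public ChildFirstClassLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve)
            throws ClassNotFoundException {
        synchronized (getClassLoadingLock(name)) {
            Class<?> c = findLoadedClass(name);
            if (c == null) {
                if (name.startsWith("java.") || name.startsWith("javax.")) {
                    // Core classes: keep normal parent-first delegation.
                    c = super.loadClass(name, false);
                } else {
                    try {
                        // Child-first: try our own definition first.
                        c = findClass(name);
                    } catch (ClassNotFoundException e) {
                        // Not found locally: fall back to the parent.
                        c = super.loadClass(name, false);
                    }
                }
            }
            if (resolve) {
                resolveClass(c);
            }
            return c;
        }
    }

    @Override
    protected Class<?> findClass(String name) throws ClassNotFoundException {
        // Locate the .class file as a resource and define it ourselves.
        String path = name.replace('.', '/') + ".class";
        try (InputStream in = getResourceAsStream(path)) {
            if (in == null) {
                throw new ClassNotFoundException(name);
            }
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            byte[] bytes = out.toByteArray();
            return defineClass(name, bytes, 0, bytes.length);
        } catch (IOException e) {
            throw new ClassNotFoundException(name, e);
        }
    }
}
```

If ExecutorClassLoader resolved scala.* classes in this order, it would not pick up the 2.10 Option class from the system classloader when a 2.11 scala-library is on its own class path.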