Hello,

I have been using Externalizer from Chill as a serialization wrapper. It
appears to me that Spark has a classloader conflict with Chill. I have the
following (simplified) program:

import java.io._
import com.twitter.chill.Externalizer

class X(val i: Int) { override def toString() = s"X(i=$i)" }

object SimpleApp {
  def main(args: Array[String]) {
    val bos = new ByteArrayOutputStream(100000)
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(Externalizer(new X(10)))
    oos.close()

    val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
    val y = ois.readObject.asInstanceOf[Externalizer[X]]
    println(y.get)
  }
}

When I run it as a normal program (i.e. "sbt run"), it runs fine.

But when I run it with spark-submit (i.e. "spark-submit --verbose --class
SimpleApp --master local[4] target/scala-2.10/simple-project_2.10-1.0.jar"),
the program fails at the "ois.readObject" call with an error that Kryo
cannot find the class "X":

Exception in thread "main" com.esotericsoftware.kryo.KryoException: Unable to find class: X
	at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
	at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)

I guess the issue is that Spark loads the application jar through its own
classloader, while the Kryo instance inside Externalizer resolves classes
against a different classloader that does not see the same classpath.
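One workaround I have been experimenting with, as an untested sketch of the
general technique (it does not use Spark or Chill, and the names
LoaderAwareObjectInputStream and Y are made up for illustration), is to
resolve classes through an explicitly chosen classloader during Java
deserialization, which is my understanding of how Spark handles application
classes in its own deserializers:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, InputStream,
  ObjectInputStream, ObjectOutputStream, ObjectStreamClass}

// Hypothetical helper: an ObjectInputStream that tries a caller-supplied
// classloader first, falling back to the default resolution if the class
// is not found there.
class LoaderAwareObjectInputStream(in: InputStream, loader: ClassLoader)
    extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] =
    try Class.forName(desc.getName, false, loader)
    catch { case _: ClassNotFoundException => super.resolveClass(desc) }
}

object LoaderDemo {
  @SerialVersionUID(1L)
  class Y(val i: Int) extends Serializable

  def main(args: Array[String]): Unit = {
    // Round-trip a Y through Java serialization, resolving its class via
    // the classloader that is known to contain Y.
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(new Y(10))
    oos.close()

    val ois = new LoaderAwareObjectInputStream(
      new ByteArrayInputStream(bos.toByteArray), classOf[Y].getClassLoader)
    println(ois.readObject.asInstanceOf[Y].i)
  }
}
```

In the Chill case the analogous move might be to set the thread context
classloader to one that contains X before calling ois.readObject, but I have
not verified whether Externalizer's internal Kryo consults it.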

Is there any way to remedy this issue?

Thanks.

Justin
