Hello,

Is it possible to use a custom class as Spark's KryoSerializer when running
under Mesos?

I've tried adding the jar containing the class to my Spark context (via
SparkConf.setJars), but I always get:
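
For concreteness, my setup looks roughly like this (the master URL and jar
path are placeholders; the serializer class is the one from the trace below):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of my configuration; paths and master URL are illustrative.
val conf = new SparkConf()
  .setMaster("mesos://master:5050")
  .setAppName("flambo-app")
  // Ask Spark to use my custom serializer on driver and executors.
  .set("spark.serializer", "flambo.kryo.FlamboKryoSerializer")
  // Ship the jar that contains FlamboKryoSerializer to the cluster.
  .setJars(Seq("/path/to/flambo.jar"))
val sc = new SparkContext(conf)
```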

java.lang.ClassNotFoundException: flambo.kryo.FlamboKryoSerializer
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at org.apache.spark.serializer.SerializerManager.get(SerializerManager.scala:56)
        at org.apache.spark.serializer.SerializerManager.setDefault(SerializerManager.scala:38)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:146)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:110)
        at org.apache.spark.executor.MesosExecutorBackend.registered(MesosExecutorBackend.scala:58)
Exception in thread "Thread-0"


Do I need to include the jar containing my serializer class in the executor
tgz built by make-distribution, or something along those lines?

Thanks
