(Replying on the new Spark mailing list since the old one closed.)

Are you sure Spark is finding your build of Mesos instead of the Apache one 
published to Maven Central? Unfortunately, code compiled against different protobuf 
versions is not compatible, because the code generated by the protoc compiler 
changes between versions (even though the serialized protobufs themselves can be 
read across versions). Since the published Mesos artifact is still built with 
protobuf 2.4, you may be pulling that in somehow.
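
One quick way to check is to look inside the assembly jar you are actually running. 
A rough sketch, not tested against your setup (the jar path assumes the default sbt 
output location for Spark 0.8.1 with Scala 2.9.3, and that the Maven pom.properties 
files survived the assembly merge; adjust to your layout):

  # List Mesos- and protobuf-related entries bundled into the Spark assembly
  unzip -l assembly/target/scala-2.9.3/spark-assembly-*.jar | grep -Ei 'mesos|protobuf'

  # If the pom.properties files are present, they show the exact versions bundled
  unzip -p assembly/target/scala-2.9.3/spark-assembly-*.jar \
      META-INF/maven/org.apache.mesos/mesos/pom.properties \
      META-INF/maven/com.google.protobuf/protobuf-java/pom.properties

If that shows the Mesos 0.13.0 artifact or protobuf 2.4.x, the published versions 
are still being pulled in despite your custom build.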

If this doesn’t work, I’d actually suggest asking on the Mesos mailing list. 
They would have a better sense of why this happens, especially if you give them 
your build options.
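
If it does turn out that the Maven Central jar is what gets bundled, one way to 
force your own build in is to install it into your local Maven repository and bump 
the version Spark depends on before reassembling. This is only a sketch I haven’t 
tested against 0.8.1, and the jar path below is a placeholder for wherever your 
Mesos build put its jar:

  # Install your locally built Mesos jar into the local Maven repository
  # (placeholder path; point -Dfile at the jar your Mesos build actually produced)
  mvn install:install-file -DgroupId=org.apache.mesos -DartifactId=mesos \
      -Dversion=0.14.2 -Dpackaging=jar -Dfile=/path/to/your/mesos-0.14.2.jar

  # Then bump the "org.apache.mesos" % "mesos" dependency version in
  # project/SparkBuild.scala to 0.14.2 (and make sure the local Maven repo is
  # listed as a resolver there), and rebuild the assembly:
  SPARK_HADOOP_VERSION=2.2.0 sbt/sbt clean assembly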

Matei

On Jan 2, 2014, at 12:01 PM, Damien Dubé <damien.d...@gmail.com> wrote:

> I've tried building Spark with Mesos 0.14.2 and I get the exact same error.
> 
> Stack: [0x00007f82f5849000,0x00007f82f594a000],  sp=0x00007f82f59485d0,  free space=1021k
> Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
> V  [libjvm.so+0x632d09]  jni_GetByteArrayElements+0x89
> C  [libmesos-0.14.2.so+0x5e08b9]  mesos::FrameworkInfo construct<mesos::FrameworkInfo>(JNIEnv_*, _jobject*)+0x79
> 
> Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
> j  org.apache.mesos.MesosSchedulerDriver.initialize()V+0
> j  org.apache.mesos.MesosSchedulerDriver.<init>(Lorg/apache/mesos/Scheduler;Lorg/apache/mesos/Protos$FrameworkInfo;Ljava/lang/String;)V+62
> j  org.apache.spark.scheduler.cluster.mesos.MesosSchedulerBackend$$anon$1.run()V+44
> v  ~StubRoutines::call_stub
> 
> 
> On Monday, December 30, 2013 7:18:57 PM UTC-5, Jey Kottalam wrote:
> It looks like your Spark is built against Mesos 0.13.0 according to the stack trace. You may need to rebuild Spark to link with your custom build of Mesos 0.14.2.
> 
> -Jey
> 
> 
> On Mon, Dec 30, 2013 at 1:39 PM, Damien Dubé <damie...@gmail.com> wrote:
> Once I have my Mesos cluster up and running, my Spark job always returns the same error. I have tried multiple options but I am still getting the same error.
> 
> Here is the stack trace
> 
> Stack: [0x00007f41ea4c1000,0x00007f41ea5c2000],  sp=0x00007f41ea5c0670,  free space=1021k
> Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
> V  [libjvm.so+0x632d09]  jni_GetByteArrayElements+0x89
> C  [libmesos-0.13.0.so+0x5a4559]  mesos::FrameworkInfo construct<mesos::FrameworkInfo>(JNIEnv_*, _jobject*)+0x79
> C  0x00007f420c0798a8
> j  org.apache.spark.scheduler.cluster.mesos.MesosSchedulerBackend$$anon$1.run()V+44
> v  ~StubRoutines::call_stub
> V  [libjvm.so+0x5f8485]  JavaCalls::call_helper(JavaValue*, methodHandle*, JavaCallArguments*, Thread*)+0x365
> V  [libjvm.so+0x5f6ee8]  JavaCalls::call(JavaValue*, methodHandle, JavaCallArguments*, Thread*)+0x28
> V  [libjvm.so+0x5f71b7]  JavaCalls::call_virtual(JavaValue*, KlassHandle, Symbol*, Symbol*, JavaCallArguments*, Thread*)+0x197
> V  [libjvm.so+0x5f72d7]  JavaCalls::call_virtual(JavaValue*, Handle, KlassHandle, Symbol*, Symbol*, Thread*)+0x47
> V  [libjvm.so+0x6731e5]  thread_entry(JavaThread*, Thread*)+0xe5
> V  [libjvm.so+0x94d38f]  JavaThread::thread_main_inner()+0xdf
> V  [libjvm.so+0x94d495]  JavaThread::run()+0xf5
> V  [libjvm.so+0x815288]  java_start(Thread*)+0x108
> 
> 
> What I am trying to run:
> 
> Spark 0.8.1 
> Mesos 0.14.2
> HDFS 2.2.0 (I do not care about YARN or Hadoop MapReduce since I am using Mesos)
> Oracle Java 1.7.0-45
> 
> Here are the four options I have tried for building Spark:
> 
> SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
> and
> SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
> 
> then
> 
> make-distribution.sh --hadoop 2.2.0 --with-yarn
> and 
> make-distribution.sh --hadoop 2.2.0
> 
> 
> 
> Since all of those options are built with protobuf 2.5.0, I've rebuilt Mesos 0.14.2 using protobuf 2.5.0 as well.
> 
> The error I am getting still seems to be related to protobuf, and I seriously do not know how to debug it. All my modules are now using protobuf 2.5.0.
> 
> 
> Any ideas?
> 
> 