Hi, Tim,
  Thanks, but I found that after a Spark job finishes successfully, there is
still no info about it on the Mesos UI under "Completed Tasks", so I have no
way to tell whether Spark really ran on Mesos. My test job:
$ ./bin/run-example SparkPi
...
Pi is roughly 3.14304
...

Maybe the way I ran SparkPi above is wrong? How should I run it on Mesos?
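
One way to make the master explicit, instead of relying on the MASTER
variable, would be spark-submit (just a sketch; the examples jar name below
is my guess from the 1.1.1 binary layout, so adjust it if yours differs):

$ ./bin/spark-submit --master mesos://clus-1:5050 \
    --class org.apache.spark.examples.SparkPi \
    lib/spark-examples-1.1.1-hadoop2.4.0.jar 100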

> export MESOS_NATIVE_LIBRARY=/home/ubuntu/mesos-0.21.0/build/lib/libmesos-0.21.0.so
> export PROTOBUF_JAR=/home/ubuntu/hadoop-2.5.0-cdh5.2.0/protobuf-java-2.5.0.jar
> export MESOS_JAR=/home/ubuntu/hadoop-2.5.0-cdh5.2.0/mesos-0.21.0.jar
> export SPARK_EXECUTOR_URI=hdfs://clus-1:9000/user/ubuntu/spark-1.1.1-bin-hadoop2.4_mesos.tar.gz
> export MASTER=mesos://clus-1:5050/mesos
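
To double-check outside the web UI, I suppose I could also hit the master's
state endpoint and look for the job under completed frameworks once it
finishes (assuming the usual Mesos HTTP port from above):

$ curl -s http://clus-1:5050/master/state.json |
    python -c 'import json,sys; print([f["name"] for f in json.load(sys.stdin)["completed_frameworks"]])'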

Cheers,
Dan

2015-02-27 16:34 GMT-06:00 Tim Chen <[email protected]>:

> Hi Dan,
>
> You won't see any active frameworks until you start running a Spark job.
> This is because each Spark job actually launches a new Spark framework
> that does the scheduling for that single job.
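>
> For example, each of these runs should register as its own framework and
> then move to the completed/terminated list when it exits (a sketch):
>
> $ ./bin/run-example SparkPi   # framework #1, unregisters on exit
> $ ./bin/run-example SparkPi   # framework #2, a brand-new registration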
>
> Tim
>
> On Fri, Feb 27, 2015 at 1:39 PM, Dan Dong <[email protected]> wrote:
>
>> Hi, Dick,
>>   By Spark daemons I mean the Master and Worker processes that run on the
>> master and the slaves, respectively, when you run ./sbin/start-master.sh
>> and ./sbin/start-slaves.sh.
>>
>> Cheers,
>> Dan
>>
>>
>> 2015-02-27 15:02 GMT-06:00 Dick Davies <[email protected]>:
>>
>>> What do you mean by Spark daemons?
>>>
>>> The Spark shell (or any other Spark application) acts as a Mesos
>>> framework, so until one is running, Spark isn't 'on' Mesos.
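>>>
>>> As a quick check (assuming the MASTER export from your mail below),
>>> something like
>>>
>>> $ MASTER=mesos://clus-1:5050 ./bin/spark-shell
>>>
>>> should show up as an active framework in the Mesos UI for as long as
>>> the shell stays open.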
>>>
>>> On 27 February 2015 at 16:23, Dan Dong <[email protected]> wrote:
>>> > Hi, All,
>>> >   When I configured and started the Spark daemons, why could I not see
>>> > them under "Active Frameworks" on the Mesos UI? (For Hadoop, a framework
>>> > shows up immediately once the Hadoop daemons start.) I can see a Spark
>>> > framework on the Mesos UI only when I run the spark-shell command
>>> > interactively. Is this normal?
>>> >
>>> > export MESOS_NATIVE_LIBRARY=/home/ubuntu/mesos-0.21.0/build/lib/libmesos-0.21.0.so
>>> > export PROTOBUF_JAR=/home/ubuntu/hadoop-2.5.0-cdh5.2.0/protobuf-java-2.5.0.jar
>>> > export MESOS_JAR=/home/ubuntu/hadoop-2.5.0-cdh5.2.0/mesos-0.21.0.jar
>>> > export SPARK_EXECUTOR_URI=hdfs://clus-1:9000/user/ubuntu/spark-1.1.1-bin-hadoop2.4_mesos.tar.gz
>>> > export MASTER=mesos://clus-1:5050/mesos
>>> >
>>> >
>>> > Cheers,
>>> > Dan
>>> >
>>>
>>
>>
>
