Thanks,
Mesos will show that the Spark driver is running, but what happens when my
batch job finishes? How can I reschedule it without Chronos? Can I submit a
job without starting it?

Thanks

b0c1

----------------------------------------------------------------------------------------------------------------------------------
Skype: boci13, Hangout: boci.b...@gmail.com

On Fri, Jul 24, 2015 at 11:52 PM, Dean Wampler <deanwamp...@gmail.com>
wrote:

> When running Spark in Mesos cluster mode, the driver program runs in one
> of the cluster nodes, like the other Spark processes that are spawned. You
> won't need a special node for this purpose. I'm not very familiar with
> Chronos, but its UI or the regular Mesos UI should show you where the
> driver is running, then you can use the Spark web UI on that machine to see
> what the Spark job is doing.
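> A minimal sketch of what that looks like (host names, class name, and jar
> URL are placeholders, not from this thread):
>
> ```shell
> # Start Spark's Mesos cluster dispatcher once; it registers with the
> # Mesos master and accepts cluster-mode submissions (default port 7077):
> ./sbin/start-mesos-dispatcher.sh --master mesos://mesos-master:5050
>
> # Submit against the dispatcher; --supervise restarts the driver on failure:
> ./bin/spark-submit \
>   --master mesos://dispatcher-host:7077 \
>   --deploy-mode cluster \
>   --supervise \
>   --class com.example.MyApp \
>   http://repo.example.com/jars/my-app.jar
> ```
>
> Note the jar is given as a URL rather than a local path: in cluster mode
> the node that runs the driver must be able to fetch it.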
>
> dean
>
> Dean Wampler, Ph.D.
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
> Typesafe <http://typesafe.com>
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
>
> On Fri, Jul 24, 2015 at 4:47 PM, boci <boci.b...@gmail.com> wrote:
>
>> Thanks, but something is still not clear...
>> I have a Mesos cluster.
>> - I want to submit my application and schedule it with Chronos.
>> - For cluster mode I need a dispatcher; is this another container (a
>> machine in the real world)? What does it do? Is it still needed when I'm
>> using Chronos?
>> - How can I access my Spark job from Chronos?
>>
>> I think submitting in client mode doesn't fit my use case; is that right?
>>
>> Thanks
>> b0c1
>>
>>
>>
>> On Wed, Jul 22, 2015 at 4:51 PM, Dean Wampler <deanwamp...@gmail.com>
>> wrote:
>>
>>> This page, http://spark.apache.org/docs/latest/running-on-mesos.html,
>>> covers many of these questions. If you submit a job with the option
>>> "--supervise", it will be restarted if it fails.
>>>
>>> You can use Chronos for scheduling. Alternatively, you can create a
>>> single streaming job with a 10-minute batch interval, if that fits your
>>> every-10-minutes requirement.
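>>> For the scheduled-batch case, a hedged example of a Chronos job
>>> definition (all field values are placeholders) that re-submits a batch
>>> job every 10 minutes, so you don't have to submit it by hand:
>>>
>>> ```json
>>> {
>>>   "name": "spark-batch-example",
>>>   "schedule": "R/2015-07-25T00:00:00Z/PT10M",
>>>   "command": "/opt/spark/bin/spark-submit --master mesos://mesos-master:5050 --class com.example.MyBatchApp /opt/jobs/my-batch.jar",
>>>   "owner": "someone@example.com",
>>>   "cpus": 0.5,
>>>   "mem": 512
>>> }
>>> ```
>>>
>>> POSTing this to Chronos's /scheduler/iso8601 endpoint creates the job;
>>> the ISO 8601 schedule "R/.../PT10M" means "repeat forever, every 10
>>> minutes."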
>>>
>>> Dean Wampler, Ph.D.
>>> Author: Programming Scala, 2nd Edition
>>> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
>>> Typesafe <http://typesafe.com>
>>> @deanwampler <http://twitter.com/deanwampler>
>>> http://polyglotprogramming.com
>>>
>>> On Wed, Jul 22, 2015 at 3:53 AM, boci <boci.b...@gmail.com> wrote:
>>>
>>>> Hi guys!
>>>>
>>>> I'm new to Mesos. I have two Spark applications (one streaming and one
>>>> batch), and I want to run both on a Mesos cluster. For testing I want to
>>>> run it in Docker containers, so I started a simple redjack/mesos-master,
>>>> but a lot of things are unclear to me (about both Mesos and Spark on
>>>> Mesos).
>>>>
>>>> If I have a Mesos cluster (for testing it will be some Docker
>>>> containers), do I need a separate machine (container) to run my Spark
>>>> job? Or can I submit to the cluster and schedule it there (with Chronos
>>>> or something else)?
>>>> How can I run the streaming job? What happens if the "controller" dies?
>>>> If I call spark-submit with master=mesos, is my application started so
>>>> that I can forget about it? How can I run a job every 10 minutes without
>>>> submitting it every 10 minutes? How can I run my streaming app in HA
>>>> mode?
>>>>
>>>> Thanks
>>>>
>>>> b0c1
>>>>
>>>>
>>>>
>>>
>>>
>>
>
