I am trying to do the same, but so far no luck...
I have everything running inside Docker containers, including the Mesos 
master, Mesos slave, Marathon, and the Spark Mesos cluster dispatcher.

But when I try to run spark-submit itself as a Docker container, the job 
fails...

By the way, this setup is on a single CentOS 7 machine.

Let me know if you have any insights on how to make the submit work.
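For context, this is roughly what I am running (a sketch only; the image name, dispatcher address, and jar URL are placeholders from my setup, not literal values):

```shell
# Sketch: submitting through the Mesos cluster dispatcher from a container.
# Image name, dispatcher host/port, and jar URL are placeholders.
docker run --rm my-spark-image \
  /opt/spark/bin/spark-submit \
    --master mesos://spark-dispatcher:7077 \
    --deploy-mode cluster \
    --class org.apache.spark.examples.SparkPi \
    http://some-host/jars/spark-examples.jar
```

Note that in cluster mode the driver is launched on the cluster, so the application jar has to be fetchable by the Mesos slaves (e.g. over HTTP or from a shared volume).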

Ashish

Sent from my iPad

> On Mar 5, 2016, at 3:33 AM, Tamas Szuromi <tamas.szur...@odigeo.com.INVALID> 
> wrote:
> 
> Hi, have a look at http://spark.apache.org/docs/latest/configuration.html 
> to see which ports need to be exposed. With Mesos we had a lot of problems 
> with container networking, but yes, --net=host is a shortcut.
> 
> Tamas
> 
> 
> 
>> On 4 March 2016 at 22:37, yanlin wang <yanl...@me.com> wrote:
>> We would like to run multiple Spark drivers in Docker containers. Any 
>> suggestions for the port exposure and network settings for Docker so the 
>> driver is reachable by the worker nodes? --net=host is the last thing we 
>> want to do.
>> 
>> Thx
>> Yanlin
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>> 
> 
