As Tim suggested: spark.mesos.executor.docker.image is your friend.
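For example, the image can be set when submitting a job through the dispatcher (a minimal sketch; the master URL, image name, and jar path are placeholders, assuming the image already contains a Spark installation):

```shell
spark-submit \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.executor.docker.image=your-registry/spark:1.6.1 \
  --class org.apache.spark.examples.SparkPi \
  http://your-artifact-host/spark-examples.jar 100
```

If you still want host-mounted artifacts instead, `spark.mesos.executor.docker.volumes` lets you mount host directories into the executor container (format `host_path:container_path[:ro|:rw]`).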
Kind regards,

Radek Gruchalski

[email protected] (mailto:[email protected])
 
(mailto:[email protected])
de.linkedin.com/in/radgruchalski/ (http://de.linkedin.com/in/radgruchalski/)

Confidentiality:
This communication is intended for the above-named person and may be 
confidential and/or legally privileged.
If it has come to you in error you must take no action based on it, nor must 
you copy or show it to anyone; please delete/destroy and inform the sender 
immediately.



On Tuesday, 15 March 2016 at 17:23, Pradeep Chhetri wrote:

> Hello Radoslaw,
>  
> Thank you for the quick reply. A few questions:
>  
> 1) Do you mean mounting the Spark artifacts as a volume on each Mesos agent node? 
>  This means the number of volumes equals the number of Mesos agents.
>  
> 2) Since I am not using HDFS at all, that is definitely not an option for me.
>  
> Isn't there a way to launch the Spark tasks themselves as Docker containers 
> that are self-contained with the Spark artifacts?
>  
> Thanks.
>  
> On Tue, Mar 15, 2016 at 3:49 PM, Radoslaw Gruchalski <[email protected]> wrote:
> > Pradeep,
> >  
> > You can mount a Spark directory as a volume. This means you have to have 
> > Spark deployed on every agent.
> >  
> > Another thing you can do is place Spark in HDFS, assuming you have HDFS 
> > available, but that too will download a copy to the sandbox.
> >  
> > I'd prefer the former.
> >  
> > Sent from Outlook Mobile (https://aka.ms/qtex0l)
> > _____________________________
> > From: Pradeep Chhetri <[email protected]>
> > Sent: Tuesday, March 15, 2016 4:41 pm
> > Subject: Apache Spark Over Mesos
> > To: <[email protected] (mailto:[email protected])>
> >  
> >  
> >  
> > Hello,  
> >  
> > I am able to run Apache Spark over Mesos. It's quite simple to run the Spark 
> > Dispatcher over Marathon and ask it to run the Spark executor (I guess it can 
> > also be called the Spark driver) as a Docker container.  
> >  
> > I have a query regarding this:  
> >  
> > All Spark tasks are spawned by first downloading the Spark artifacts. I was 
> > wondering whether there is some way I can start them as Docker containers 
> > too; this would save the time spent downloading the Spark artifacts. I am 
> > running Spark in fine-grained mode.  
> >  
> > I have attached a screenshot of a sample job  
> >  
> > Thanks,  
> >  
> > --  
> > Pradeep Chhetri  
> >  
>  
>  
>  
> --  
> Pradeep Chhetri  
