Hello,

I am able to run Apache Spark over Mesos. It's quite simple to run the Spark
Dispatcher on Marathon and have it launch the Spark driver as a Docker
container.
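
For context, I submit jobs to the Dispatcher roughly like this (the
dispatcher host, class name, and jar URL below are just placeholders):

  spark-submit \
    --master mesos://spark-dispatcher.marathon.mesos:7077 \
    --deploy-mode cluster \
    --class com.example.MyJob \
    http://artifact-server/jars/my-job.jar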

I have a query regarding this:

All Spark tasks are currently spawned by first downloading the Spark
artifacts onto the agent. I was wondering whether there is a way to start
them as Docker containers as well, which would save the time spent
downloading the artifacts. I am running Spark in fine-grained mode.
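
I was hoping for something along the lines of the
spark.mesos.executor.docker.image property, e.g. (the image name and job
details are placeholders):

  spark-submit \
    --master mesos://spark-dispatcher.marathon.mesos:7077 \
    --deploy-mode cluster \
    --conf spark.mesos.coarse=false \
    --conf spark.mesos.executor.docker.image=myrepo/spark:latest \
    --class com.example.MyJob \
    http://artifact-server/jars/my-job.jar

Does that property also apply to the tasks in fine-grained mode, or does it
only control the container for the driver submitted through the Dispatcher?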

I have attached a screenshot of a sample job.

Thanks,

-- 
Pradeep Chhetri
