Pradeep,
You can mount a Spark directory as a volume. This means you have to have Spark
deployed on every agent.
Alternatively, you can place Spark in HDFS (assuming you have HDFS
available), but that too will download a copy to the sandbox.
I'd prefer the former.
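For illustration, both options can be expressed in spark-defaults.conf (or as --conf flags to spark-submit). This is only a sketch: the image name, host path, and HDFS URL below are placeholder assumptions, not values from this thread.

```
# Option 1 (preferred above): run executors in a Docker image and mount a
# host directory containing Spark -- requires Spark deployed at this path
# on every agent (placeholder path and image name)
spark.mesos.executor.docker.image    my-registry/spark:1.6.0
spark.mesos.executor.docker.volumes  /opt/spark:/opt/spark:ro

# Option 2: fetch a Spark distribution from HDFS (placeholder URL);
# note Mesos still downloads a copy into each task's sandbox
spark.executor.uri  hdfs://namenode:8020/dist/spark-1.6.0-bin-hadoop2.6.tgz
```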
_____________________________
From: Pradeep Chhetri <[email protected]>
Sent: Tuesday, March 15, 2016 4:41 pm
Subject: Apache Spark Over Mesos
To: <[email protected]>
Hello,
I am able to run Apache Spark over Mesos. It's quite simple to run the
Spark Dispatcher over Marathon and ask it to run the Spark driver as a
Docker container.
I have a query regarding this:
All Spark tasks are spawned by first downloading the Spark
artifacts. I was wondering whether there is some way to start them as Docker
containers too, which would save the time spent downloading the Spark
artifacts. I am running Spark in fine-grained mode.
I have attached a screenshot of a sample job.
Thanks,
--
Pradeep Chhetri