[ https://issues.apache.org/jira/browse/SPARK-2691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14194632#comment-14194632 ]
Chris Heller commented on SPARK-2691:
-------------------------------------

That seems reasonable. In fact, the volumes field of a ContainerInfo is not part of the DockerInfo structure, but since DOCKER is currently the only ContainerInfo type, and since the volumes field is described perfectly by the 'docker run -v' syntax, it seems OK to repurpose it here.

> Allow Spark on Mesos to be launched with Docker
> -----------------------------------------------
>
>                 Key: SPARK-2691
>                 URL: https://issues.apache.org/jira/browse/SPARK-2691
>             Project: Spark
>          Issue Type: Improvement
>          Components: Mesos
>            Reporter: Timothy Chen
>            Assignee: Timothy Chen
>              Labels: mesos
>         Attachments: spark-docker.patch
>
> Currently, to launch Spark with Mesos one must upload a tarball and specify the executor URI to be passed in, which is then downloaded on each slave, or even on each execution, depending on whether coarse-grained mode is used.
> We want to make Spark able to support launching executors via a Docker image, utilizing the recent Docker and Mesos integration work.
> With the recent integration, Spark can simply specify a Docker image and the options that are needed, and it should continue to work as-is.
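The 'docker run -v' volume syntax referenced above has the shape `[host_path:]container_path[:ro|:rw]`. As a minimal sketch of how such a spec decomposes (the function name and return shape are illustrative, not part of the patch or of Mesos):

```python
def parse_volume(spec):
    """Split a 'docker run -v' style volume string of the form
    [host_path:]container_path[:ro|:rw] into its components.

    Returns (host_path or None, container_path, mode), where mode
    defaults to 'rw' as docker does when no mode suffix is given.
    """
    parts = spec.split(":")
    mode = "rw"  # docker's default when :ro/:rw is omitted
    if parts[-1] in ("ro", "rw"):
        mode = parts[-1]
        parts = parts[:-1]
    if len(parts) == 2:
        host_path, container_path = parts[0], parts[1]
    elif len(parts) == 1:
        # No host path: docker creates an anonymous volume
        host_path, container_path = None, parts[0]
    else:
        raise ValueError("unrecognized volume spec: %r" % spec)
    return host_path, container_path, mode
```

Each component maps naturally onto the Volume protobuf inside ContainerInfo (host path, container path, and RO/RW mode), which is why the field repurposing fits.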