[ https://issues.apache.org/jira/browse/SPARK-2691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14204764#comment-14204764 ]

Chris Heller edited comment on SPARK-2691 at 11/10/14 1:28 PM:
---------------------------------------------------------------

Just an update: I've been working on this patch over on GitHub 
(https://github.com/apache/spark/pull/3074), and have added support for Docker 
in both coarse-grained and fine-grained mode, as well as the ability to map ports.
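For anyone who wants to try the branch, here is a minimal sketch of what a dockerized launch could look like. The property names are the ones proposed in the PR and could change before merge; the master URL and image name are placeholders.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("mesos://host:5050")          // placeholder Mesos master
  .setAppName("docker-example")
  // Run executors inside this image (placeholder name).
  .set("spark.mesos.executor.docker.image", "example/spark:latest")
  // Map host port 31000 to container port 8080 over tcp.
  .set("spark.mesos.executor.docker.portmaps", "31000:8080:tcp")

val sc = new SparkContext(conf)
{code}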

[~tarnfeld] {{spark.executorEnv}} was already available among the Spark 
configuration properties before my changes.
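For example, pushing a variable into every executor's environment already works like this (the variable and value here are placeholders):

{code:scala}
import org.apache.spark.SparkConf

// spark.executorEnv.<NAME> sets NAME in each executor's environment;
// this predates the Docker changes in this patch.
val conf = new SparkConf()
  .set("spark.executorEnv.LD_LIBRARY_PATH", "/usr/local/lib")
{code}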

> Allow Spark on Mesos to be launched with Docker
> -----------------------------------------------
>
>                 Key: SPARK-2691
>                 URL: https://issues.apache.org/jira/browse/SPARK-2691
>             Project: Spark
>          Issue Type: Improvement
>          Components: Mesos
>            Reporter: Timothy Chen
>            Assignee: Timothy Chen
>              Labels: mesos
>         Attachments: spark-docker.patch
>
>
> Currently, to launch Spark with Mesos, one must upload a tarball and specify 
> an executor URI to be passed in, which is then downloaded on each slave (or 
> even on each execution, depending on whether coarse-grained mode is used).
> We want Spark to support launching executors via a Docker image, building on 
> the recent Docker and Mesos integration work.
> With that integration, Spark can simply specify a Docker image and whatever 
> options are needed, and everything should continue to work as-is.
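To make the contrast concrete, here is a sketch of the two launch styles. The Docker property name is the one proposed in the open PR, and the tarball URI and image name are placeholders:

{code:scala}
import org.apache.spark.SparkConf

// Today: each slave downloads and unpacks the executor tarball from this URI.
val tarballConf = new SparkConf()
  .set("spark.executor.uri", "hdfs://namenode/path/spark-1.1.0-bin.tgz")

// Proposed: executors launch from a pre-built Docker image instead.
val dockerConf = new SparkConf()
  .set("spark.mesos.executor.docker.image", "example/spark:1.1.0")
{code}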


