[ https://issues.apache.org/jira/browse/SPARK-3821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15292488#comment-15292488 ]

Mete Kural commented on SPARK-3821:
-----------------------------------

Thank you for the response, Nicholas. spark-ec2 does take care of AMIs for EC2; 
it is documented in the Spark documentation as a deployment method and ships 
with the Spark distribution. Docker, however, doesn't seem to have the same 
level of presence as a deployment method. What's inside the docker folder in 
Spark is not really in shape for a production deployment, is not covered in the 
Spark documentation, and doesn't seem to have been worked on in quite a while. 
It seems the only way the Spark project officially supports running Spark on 
Docker is via Mesos; would you say that is correct? With Docker becoming an 
industry standard as of a month ago, I hope there will be renewed interest 
within the Spark project in supporting Docker as an official deployment method 
without the Mesos requirement.

> Develop an automated way of creating Spark images (AMI, Docker, and others)
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-3821
>                 URL: https://issues.apache.org/jira/browse/SPARK-3821
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, EC2
>            Reporter: Nicholas Chammas
>            Assignee: Nicholas Chammas
>         Attachments: packer-proposal.html
>
>
> Right now the creation of Spark AMIs or Docker containers is done manually. 
> With tools like [Packer|http://www.packer.io/], we should be able to automate 
> this work, and do so in such a way that multiple types of machine images can 
> be created from a single template.
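For illustration, here is a minimal sketch of what such a single template might 
look like in Packer's JSON format, with one builder per image type. All concrete 
values below (the source AMI placeholder, the install-spark.sh provisioning 
script, the example/spark repository name, the Spark version, region, and base 
images) are hypothetical, not something specified in this issue or its 
attachment. JSON templates do not allow comments, so briefly: the two builders 
produce an AMI and a Docker image from the same provisioning step, and the 
post-processor tags only the Docker output.

{code:json}
{
  "variables": {
    "spark_version": "1.1.0"
  },
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "source_ami": "ami-XXXXXXXX",
      "instance_type": "m3.large",
      "ssh_username": "ec2-user",
      "ami_name": "spark-{{user `spark_version`}}-{{timestamp}}"
    },
    {
      "type": "docker",
      "image": "ubuntu:14.04",
      "commit": true
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "script": "install-spark.sh"
    }
  ],
  "post-processors": [
    {
      "type": "docker-tag",
      "repository": "example/spark",
      "tag": "{{user `spark_version`}}",
      "only": ["docker"]
    }
  ]
}
{code}

Running "packer build template.json" would then produce both images in one 
invocation, while "packer build -only=docker template.json" would build just 
the Docker image.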


