[ 
https://issues.apache.org/jira/browse/SPARK-4977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14259480#comment-14259480
 ] 

Shivaram Venkataraman commented on SPARK-4977:
----------------------------------------------

I've run into this before too, but it's not easy to fix. The reason most 
conf files get overwritten is that hostnames change on EC2 when machines are 
stopped and started, so we need to update the hostnames in the config files. 
There are a couple of solutions I can think of:

1. Provide an extension-like mechanism where we source a script containing 
user-defined options (e.g. spark-env-extensions.sh) and never overwrite that 
file during start / stop. 
2. Separate the conf files that need hostname changes from those that don't, 
and only overwrite the former. This will need changes to `deploy_templates.py` 
in our current setup.
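
The second option could be sketched roughly as below. This is an illustrative 
example, not the actual `deploy_templates.py` code: the file lists and the 
`files_to_overwrite` helper are hypothetical, and the real template set on the 
cluster differs.

```python
# Hypothetical sketch of option 2: classify conf templates by whether
# their rendered contents embed EC2 hostnames, and only re-render the
# hostname-dependent ones after a stop/start cycle, leaving user-edited
# files untouched. File names below are illustrative only.

HOSTNAME_DEPENDENT = {
    "spark/conf/slaves",          # lists worker hostnames
    "spark/conf/core-site.xml",   # embeds the master's address
}
HOSTNAME_INDEPENDENT = {
    "spark/conf/spark-defaults.conf",  # pure user tuning
    "spark/conf/log4j.properties",
}

def files_to_overwrite(all_templates):
    """Return only the templates that must be re-rendered on restart
    because they embed EC2 hostnames."""
    return [t for t in all_templates if t in HOSTNAME_DEPENDENT]

# On restart, only the hostname-dependent files would be rewritten:
templates = sorted(HOSTNAME_DEPENDENT | HOSTNAME_INDEPENDENT)
print(files_to_overwrite(templates))
# → ['spark/conf/core-site.xml', 'spark/conf/slaves']
```

With a split like this, `spark-ec2 start` could regenerate hostnames without 
clobbering user tuning in spark-defaults.conf.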

> spark-ec2 start resets all the spark/conf configurations
> --------------------------------------------------------
>
>                 Key: SPARK-4977
>                 URL: https://issues.apache.org/jira/browse/SPARK-4977
>             Project: Spark
>          Issue Type: Bug
>          Components: EC2
>    Affects Versions: 1.2.0
>            Reporter: Noah Young
>            Priority: Minor
>
> Running `spark-ec2 start` to restart an already-launched cluster causes the 
> cluster setup scripts to be run, which reset any existing spark configuration 
> files on the remote machines. The expected behavior is that all the modules 
> (tachyon, hadoop, spark itself) should be restarted, and perhaps the master 
> configuration copy-dir'd out, but anything in spark/conf should (at least 
> optionally) be left alone.
> As far as I know, one must currently create and run their own init script to 
> reapply all Spark configuration after restarting a cluster.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
