Okay, I just went ahead and fixed this to make it backwards-compatible
(it was a simple fix). I successfully launched a cluster with Spark
0.8.1.
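
For the record, the change is conceptually just a guard in
deploy_templates.py so that variables the launching client never
exported become empty strings instead of None (a minimal sketch of the
idea, not the exact committed code):

    import os

    # Sketch: fall back to "" for any variable the launching
    # spark-ec2.py never exported (older versions don't set the new ones),
    # so str.replace never receives None.
    text = "worker instances: {{spark_worker_instances}}"
    template_vars = {
        "spark_worker_instances": os.getenv("SPARK_WORKER_INSTANCES"),
    }
    for key in template_vars:
        value = template_vars[key]
        if value is None:  # unset by the client -> empty substitution
            value = ""
        text = text.replace("{{" + key + "}}", value)

With that, older launch scripts that predate the new variables simply
get empty substitutions instead of crashing.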

Jeremy - if you could try again and let me know if there are any
issues, that would be great. Thanks again for reporting this.

On Sun, May 4, 2014 at 3:41 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> Hey Jeremy,
>
> This is actually a big problem - thanks for reporting it. I'm going to
> revert this change until we can make sure it is backwards-compatible.
>
> - Patrick
>
> On Sun, May 4, 2014 at 2:00 PM, Jeremy Freeman <freeman.jer...@gmail.com> 
> wrote:
>> Hi all,
>>
>> A heads up in case others hit this and are confused... This nice addition
>> <https://github.com/apache/spark/pull/612> causes an error when running the
>> spark-ec2.py deploy script from a version other than master (e.g. 0.8.0).
>>
>> The error occurs during launch, here:
>>
>> ...
>> Creating local config files...
>> configuring /etc/ganglia/gmond.conf
>> Traceback (most recent call last):
>>   File "./deploy_templates.py", line 89, in <module>
>>     text = text.replace("{{" + key + "}}", template_vars[key])
>> TypeError: expected a character buffer object
>> Deploying Spark config files...
>> chmod: cannot access `/root/spark/conf/spark-env.sh': No such file or
>> directory
>> ...
>>
>> Several more errors then follow because of the missing variables (the
>> cluster itself launches, but with a number of configuration problems,
>> e.g. with HDFS).
>>
>> deploy_templates fails because the new SPARK_MASTER_OPTS and
>> SPARK_WORKER_INSTANCES variables are never set by earlier versions of
>> spark-ec2.py, which still pull deploy_templates from
>> https://github.com/mesos/spark-ec2.git -b v2, the branch that now
>> expects the new variables.
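>>
>> For context: Python 2's str.replace raises exactly this TypeError when
>> its replacement argument is None, which is what an environment lookup
>> returns for an unset variable. A minimal standalone illustration
>> (hypothetical template string, assuming template_vars is built from
>> environment lookups):
>>
>>     import os
>>
>>     text = "master opts: {{spark_master_opts}}"
>>     # None when the launching spark-ec2.py (e.g. 0.8.0) never exports it
>>     template_vars = {"spark_master_opts": os.environ.get("SPARK_MASTER_OPTS")}
>>     for key in template_vars:
>>         # Python 2: TypeError: expected a character buffer object
>>         text = text.replace("{{" + key + "}}", template_vars[key])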
>>
>> Using the updated spark-ec2.py from master works fine.
>>
>> -- Jeremy
>>
>> ---------------------
>> Jeremy Freeman, PhD
>> Neuroscientist
>> @thefreemanlab
>>
