This shouldn't be a chicken-and-egg problem, since the script fetches the AMI 
from a known URL. Seems like an issue in publishing this release.
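To illustrate: the AMI lookup avoids the problem because spark_ec2.py resolves it at launch time from a published list instead of hard-coding it. Roughly like this (a simplified sketch; the exact URL prefix and helper shape are assumptions from memory, not the verbatim script):

    import urllib2

    # Sketch: the AMI ID is fetched from a known URL at launch time,
    # so it stays current without editing the script for each release.
    AMI_PREFIX = "https://raw.github.com/mesos/spark-ec2/v2/ami-list"  # assumed

    def get_spark_ami(region, instance_type):
        ami_path = "%s/%s/%s" % (AMI_PREFIX, region, instance_type)
        return urllib2.urlopen(ami_path).read().strip()

The default Spark version could in principle be resolved the same way. In the meantime, anyone affected can override the stale default explicitly, e.g.:

    ./spark-ec2 -k <keypair> -i <key-file> --spark-version=1.0.2 launch my-cluster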

On August 26, 2014 at 1:24:45 PM, Shivaram Venkataraman (shiva...@eecs.berkeley.edu) wrote:

This is a chicken-and-egg problem in some sense. We can't point the ec2
script at the new version until the release has been made and the binaries
uploaded -- but by that point the release is already tagged, so the copy of
the script that ships with it can't be updated.

I think the model we support so far is that you can launch the latest
Spark version from the master branch on GitHub. I guess we could try adding
something to the release process that updates the script without committing
the change? The release managers might be able to say more.
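Something along these lines in the release tooling could patch the default
in the staged release tree without committing it back (a rough sketch; the
file path, function name, and regex are assumptions):

    import re

    def set_default_spark_version(version, path="ec2/spark_ec2.py"):
        # Rewrite default="X.Y.Z" in the --spark-version option of the
        # staged copy; nothing is committed back to the repository.
        with open(path) as f:
            src = f.read()
        src = re.sub(r'(--spark-version",\s*default=)"[^"]*"',
                     r'\1"%s"' % version, src)
        with open(path, "w") as f:
            f.write(src)

Running set_default_spark_version("1.0.2") on the release branch just
before packaging would keep the script in sync with the uploaded binaries.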

Thanks  
Shivaram  


On Tue, Aug 26, 2014 at 1:16 PM, Nicholas Chammas <nicholas.cham...@gmail.com> wrote:

> I downloaded the source code release for 1.0.2 from here  
> <http://spark.apache.org/downloads.html> and launched an EC2 cluster using  
> spark-ec2.  
>  
> After the cluster finishes launching, I fire up the shell and check the  
> version:  
>  
> scala> sc.version  
> res1: String = 1.0.1  
>  
> The startup banner also shows the same thing. Hmm...  
>  
> So I dig around and find that the spark_ec2.py script has the default Spark  
> version set to 1.0.1.  
>  
> Derp.  
>  
> parser.add_option("-v", "--spark-version", default="1.0.1",
>     help="Version of Spark to use: 'X.Y.Z' or a specific git hash")
>  
> Is there any way to fix the release? It’s a minor issue, but could be very  
> confusing. And how can we prevent this from happening again?  
>  
> Nick  
>  
