Repository: spark
Updated Branches:
  refs/heads/master 5cd11d51c -> 7b4203ab4


Add Spark v0.9.1 to ec2 launch script and use it as the default

Mainly ported from branch-0.9.

Author: Harvey Feng <hyfeng...@gmail.com>

Closes #385 from harveyfeng/0.9.1-ec2 and squashes the following commits:

769ac2f [Harvey Feng] Add Spark v0.9.1 to ec2 launch script and use it as the default
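
For context, the change only moves the default; passing -v/--spark-version still pins an explicit version. A minimal sketch of that behavior (illustrative only; it reuses the option definition from spark_ec2.py but is not the script itself):

    from optparse import OptionParser

    parser = OptionParser()
    parser.add_option("-v", "--spark-version", default="0.9.1",
        help="Version of Spark to use: 'X.Y.Z' or a specific git hash")

    opts, args = parser.parse_args([])                # no -v given -> new default
    print opts.spark_version                          # 0.9.1
    opts, args = parser.parse_args(["-v", "0.9.0"])   # explicit version still wins
    print opts.spark_version                          # 0.9.0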


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7b4203ab
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7b4203ab
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7b4203ab

Branch: refs/heads/master
Commit: 7b4203ab4c640f7875ae3536228ed4d791062017
Parents: 5cd11d5
Author: Harvey Feng <hyfeng...@gmail.com>
Authored: Thu Apr 10 18:25:54 2014 -0700
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Thu Apr 10 18:25:54 2014 -0700

----------------------------------------------------------------------
 ec2/spark_ec2.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/7b4203ab/ec2/spark_ec2.py
----------------------------------------------------------------------
diff --git a/ec2/spark_ec2.py b/ec2/spark_ec2.py
index d8840c9..31209a6 100755
--- a/ec2/spark_ec2.py
+++ b/ec2/spark_ec2.py
@@ -70,7 +70,7 @@ def parse_args():
            "slaves across multiple (an additional $0.01/Gb for bandwidth" +
            "between zones applies)")
   parser.add_option("-a", "--ami", help="Amazon Machine Image ID to use")
-  parser.add_option("-v", "--spark-version", default="0.9.0",
+  parser.add_option("-v", "--spark-version", default="0.9.1",
       help="Version of Spark to use: 'X.Y.Z' or a specific git hash")
   parser.add_option("--spark-git-repo",
       default="https://github.com/apache/spark";,
@@ -157,7 +157,7 @@ def is_active(instance):
 
 # Return correct versions of Spark and Shark, given the supplied Spark version
 def get_spark_shark_version(opts):
-  spark_shark_map = {"0.7.3": "0.7.1", "0.8.0": "0.8.0", "0.8.1": "0.8.1", "0.9.0": "0.9.0"}
+  spark_shark_map = {"0.7.3": "0.7.1", "0.8.0": "0.8.0", "0.8.1": "0.8.1", "0.9.0": "0.9.0", "0.9.1": "0.9.1"}
   version = opts.spark_version.replace("v", "")
   if version not in spark_shark_map:
     print >> stderr, "Don't know about Spark version: %s" % version
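
For reference, a minimal standalone sketch (not part of this commit; the helper name and return shape are illustrative) of how the updated map resolves a requested Spark version to its matching Shark version, mirroring get_spark_shark_version:

    import sys

    spark_shark_map = {"0.7.3": "0.7.1", "0.8.0": "0.8.0", "0.8.1": "0.8.1",
                       "0.9.0": "0.9.0", "0.9.1": "0.9.1"}

    def resolve_versions(requested):
        # Strip a leading "v" so tags like "v0.9.1" match the map keys
        version = requested.replace("v", "")
        if version not in spark_shark_map:
            print >> sys.stderr, "Don't know about Spark version: %s" % version
            sys.exit(1)
        return version, spark_shark_map[version]

    print resolve_versions("v0.9.1")   # ('0.9.1', '0.9.1')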
