Repository: spark
Updated Branches:
  refs/heads/branch-1.0 4b9234905 -> 444ccdd80


SPARK-3242 [EC2] Spark 1.0.2 ec2 scripts create clusters with Spark 1.0.1 installed by default

tdas, you recorded this as a blocker to-do for branch 1.0. It seemed easy, so here's a PR.

Author: Sean Owen <so...@cloudera.com>

Closes #4458 from srowen/SPARK-3242 and squashes the following commits:

58a5ede [Sean Owen] Update Spark version in ec2 script to 1.0.3


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/444ccdd8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/444ccdd8
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/444ccdd8

Branch: refs/heads/branch-1.0
Commit: 444ccdd80ec5df249978d8498b4fc501cc3429d7
Parents: 4b92349
Author: Sean Owen <so...@cloudera.com>
Authored: Mon Feb 9 10:42:17 2015 -0800
Committer: Andrew Or <and...@databricks.com>
Committed: Mon Feb 9 10:42:17 2015 -0800

----------------------------------------------------------------------
 ec2/spark_ec2.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/444ccdd8/ec2/spark_ec2.py
----------------------------------------------------------------------
diff --git a/ec2/spark_ec2.py b/ec2/spark_ec2.py
index 348277a..a27910c 100755
--- a/ec2/spark_ec2.py
+++ b/ec2/spark_ec2.py
@@ -70,7 +70,7 @@ def parse_args():
            "slaves across multiple (an additional $0.01/Gb for bandwidth" +
            "between zones applies)")
   parser.add_option("-a", "--ami", help="Amazon Machine Image ID to use")
-  parser.add_option("-v", "--spark-version", default="1.0.1",
+  parser.add_option("-v", "--spark-version", default="1.0.3",
       help="Version of Spark to use: 'X.Y.Z' or a specific git hash")
   parser.add_option("--spark-git-repo",
      default="https://github.com/apache/spark",
@@ -164,7 +164,8 @@ def is_active(instance):
 # Return correct versions of Spark and Shark, given the supplied Spark version
 def get_spark_shark_version(opts):
  spark_shark_map = {"0.7.3": "0.7.1", "0.8.0": "0.8.0", "0.8.1": "0.8.1", "0.9.0": "0.9.0", 
-    "0.9.1": "0.9.1", "1.0.0": "1.0.0", "1.0.1": "1.0.0", "1.0.2": "1.0.0"}
+    "0.9.1": "0.9.1", "1.0.0": "1.0.0", "1.0.1": "1.0.0", "1.0.2": "1.0.0", 
+    "1.0.3": "1.0.0"}
   version = opts.spark_version.replace("v", "")
   if version not in spark_shark_map:
     print >> stderr, "Don't know about Spark version: %s" % version
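For context, a minimal Python 3 sketch (not the actual `ec2/spark_ec2.py`, which is Python 2) of the lookup logic this hunk extends: each supported Spark release is paired with the Shark release known to work with it, and unrecognized versions are rejected. The `"1.0.3": "1.0.0"` entry is the pairing this commit adds; the standalone function shape here is an illustrative assumption.

```python
import sys

# Supported Spark release -> compatible Shark release.
SPARK_SHARK_MAP = {
    "0.7.3": "0.7.1", "0.8.0": "0.8.0", "0.8.1": "0.8.1",
    "0.9.0": "0.9.0", "0.9.1": "0.9.1", "1.0.0": "1.0.0",
    "1.0.1": "1.0.0", "1.0.2": "1.0.0",
    "1.0.3": "1.0.0",  # pairing added by this commit
}

def get_spark_shark_version(spark_version):
    # Tolerate a leading "v" tag prefix, e.g. "v1.0.3" -> "1.0.3",
    # mirroring the replace("v", "") in the patched script.
    version = spark_version.replace("v", "")
    if version not in SPARK_SHARK_MAP:
        print("Don't know about Spark version: %s" % version, file=sys.stderr)
        sys.exit(1)
    return (version, SPARK_SHARK_MAP[version])
```

With this entry in place, launching a cluster with `-v 1.0.3` (the new default) resolves to Shark 1.0.0 instead of exiting with "Don't know about Spark version".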

