Merge pull request #71 from aarondav/scdefaults

Spark shell exits if it cannot create SparkContext

Mainly, this occurs when you provide a malformed MASTER URL (one that doesn't
match any of our regexes). Previously, we would default to Mesos, fail, and
then start the shell anyway, except that any Spark command would fail. Simply
exiting seems clearer.
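The fail-fast pattern described above can be sketched as follows. This is a simplified illustration, not the actual patch: `MasterUrlCheck`, its regexes, and the `MASTER` fallback value are stand-ins for the real URL-matching logic in `SparkContext`.

```scala
// Hypothetical sketch of the fail-fast behavior: validate the MASTER URL
// up front and exit, instead of silently falling back to a broken default.
object MasterUrlCheck {
  // Simplified stand-ins for Spark's master-URL regexes (assumptions).
  private val Local    = """local(\[\d+\])?""".r
  private val SparkUrl = """spark://(.+):(\d+)""".r

  def isValid(master: String): Boolean = master match {
    case Local(_)       => true
    case SparkUrl(_, _) => true
    case _              => false
  }

  def main(args: Array[String]): Unit = {
    val master = sys.env.getOrElse("MASTER", "local")
    if (!isValid(master)) {
      // Previously: fall through to a Mesos default and start a shell in
      // which every Spark command fails. Now: report and exit immediately.
      System.err.println(s"Could not parse Master URL: '$master'")
      sys.exit(1)
    }
    println(s"Master URL ok: $master")
  }
}
```

With this shape, an unrecognized URL such as `mesos//host:5050` (missing the colon) is rejected before the REPL starts, rather than producing a shell where every command fails.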


Project: http://git-wip-us.apache.org/repos/asf/incubator-spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-spark/commit/8d528af8
Tree: http://git-wip-us.apache.org/repos/asf/incubator-spark/tree/8d528af8
Diff: http://git-wip-us.apache.org/repos/asf/incubator-spark/diff/8d528af8

Branch: refs/heads/master
Commit: 8d528af829dc989d4701c08fd90d230c15df7f7e
Parents: fc26e5b 7473726
Author: Matei Zaharia <ma...@eecs.berkeley.edu>
Authored: Fri Oct 18 20:24:10 2013 -0700
Committer: Matei Zaharia <ma...@eecs.berkeley.edu>
Committed: Fri Oct 18 20:24:10 2013 -0700

----------------------------------------------------------------------
 .../src/main/scala/org/apache/spark/SparkContext.scala | 13 ++++++-------
 .../main/scala/org/apache/spark/repl/SparkILoop.scala  |  9 ++++++++-
 2 files changed, 14 insertions(+), 8 deletions(-)
----------------------------------------------------------------------