Repository: spark
Updated Branches:
  refs/heads/branch-1.0 171cea8ea -> d8767c43f


Fixed broken pyspark shell.

Author: Reynold Xin <[email protected]>

Closes #444 from rxin/pyspark and squashes the following commits:

fc11356 [Reynold Xin] Made the PySpark shell version checking compatible with Python 2.6.
571830b [Reynold Xin] Fixed broken pyspark shell.

(cherry picked from commit 81a152c54bff21854de731476f62c8fd50dd29f7)
Signed-off-by: Reynold Xin <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d8767c43
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d8767c43
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/d8767c43

Branch: refs/heads/branch-1.0
Commit: d8767c43f55646afd613259b9e06be0d8ea283fc
Parents: 171cea8
Author: Reynold Xin <[email protected]>
Authored: Fri Apr 18 10:10:13 2014 -0700
Committer: Reynold Xin <[email protected]>
Committed: Fri Apr 18 10:10:20 2014 -0700

----------------------------------------------------------------------
 python/pyspark/shell.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/d8767c43/python/pyspark/shell.py
----------------------------------------------------------------------
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index e8ba050..d172d58 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -22,7 +22,7 @@ This file is designed to be launched as a PYTHONSTARTUP script.
 """
 
 import sys
-if sys.version_info.major != 2:
+if sys.version_info[0] != 2:
     print("Error: Default Python used is Python%s" % sys.version_info.major)
     print("\tSet env variable PYSPARK_PYTHON to Python2 binary and re-run it.")
     sys.exit(1)
@@ -53,7 +53,7 @@ print("Using Python version %s (%s, %s)" % (
     platform.python_version(),
     platform.python_build()[0],
     platform.python_build()[1]))
-    print("Spark context available as sc.")
+print("SparkContext available as sc.")
 
 if add_files != None:
     print("Adding files: [%s]" % ", ".join(add_files))
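
The first hunk works around the fact that `sys.version_info` only gained named attributes such as `.major` in Python 2.7; on Python 2.6 it is a plain tuple, so positional indexing is the portable spelling. A minimal sketch (not part of the commit) of the compatible access:

```python
import sys

# Tuple indexing works on every Python version, while the .major/.minor
# named attributes were only introduced in Python 2.7, when
# sys.version_info became a named tuple.
major, minor = sys.version_info[0], sys.version_info[1]
print("Running on Python %d.%d" % (major, minor))
```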

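The second hunk dedents a stray-indented `print` call. An indent where no block is open is rejected when the source is compiled, before anything runs, which is why the shell failed to start at all rather than merely printing the wrong banner. A small illustration (not from the commit) of that failure mode:

```python
# A line indented with no enclosing block raises IndentationError at
# compile time -- mirroring how the over-indented print in shell.py
# broke the PYTHONSTARTUP script before it could run.
broken = 'print("a")\n    print("b")\n'
try:
    compile(broken, "<shell.py>", "exec")
except IndentationError as err:
    print("IndentationError:", err.msg)
```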