Repository: spark
Updated Branches:
  refs/heads/branch-1.0 13fb4c782 -> b3ad707c4


[python alternative] pyspark require Python2, failing if system default is Py3 from shell.py

Python alternative for https://github.com/apache/spark/pull/392; managed from shell.py

Author: AbhishekKr <abhikumar...@gmail.com>

Closes #399 from abhishekkr/pyspark_shell and squashes the following commits:

134bdc9 [AbhishekKr] pyspark require Python2, failing if system default is Py3 from shell.py

(cherry picked from commit bb76eae1b50e4bf18360220110f7d0a4bee672ec)
Signed-off-by: Reynold Xin <r...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b3ad707c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b3ad707c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b3ad707c

Branch: refs/heads/branch-1.0
Commit: b3ad707c4411c3860691c9eb802fec425bf29e85
Parents: 13fb4c7
Author: AbhishekKr <abhikumar...@gmail.com>
Authored: Wed Apr 16 19:05:40 2014 -0700
Committer: Reynold Xin <r...@apache.org>
Committed: Wed Apr 16 19:10:09 2014 -0700

----------------------------------------------------------------------
 python/pyspark/shell.py | 20 ++++++++++++++------
 1 file changed, 14 insertions(+), 6 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/b3ad707c/python/pyspark/shell.py
----------------------------------------------------------------------
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 61613db..e8ba050 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -20,6 +20,14 @@ An interactive shell.
 
 This file is designed to be launched as a PYTHONSTARTUP script.
 """
+
+import sys
+if sys.version_info.major != 2:
+    print("Error: Default Python used is Python%s" % sys.version_info.major)
+    print("\tSet env variable PYSPARK_PYTHON to Python2 binary and re-run it.")
+    sys.exit(1)
+
+
 import os
 import platform
 import pyspark
@@ -34,21 +42,21 @@ if os.environ.get("SPARK_EXECUTOR_URI"):
 
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
 
-print """Welcome to
+print("""Welcome to
       ____              __
      / __/__  ___ _____/ /__
     _\ \/ _ \/ _ `/ __/  '_/
    /__ / .__/\_,_/_/ /_/\_\   version 1.0.0-SNAPSHOT
       /_/
-"""
-print "Using Python version %s (%s, %s)" % (
+""")
+print("Using Python version %s (%s, %s)" % (
     platform.python_version(),
     platform.python_build()[0],
-    platform.python_build()[1])
-print "Spark context available as sc."
+    platform.python_build()[1]))
+    print("Spark context available as sc.")
 
 if add_files != None:
-    print "Adding files: [%s]" % ", ".join(add_files)
+    print("Adding files: [%s]" % ", ".join(add_files))
 
 # The ./bin/pyspark script stores the old PYTHONSTARTUP value in OLD_PYTHONSTARTUP,
 # which allows us to execute the user's PYTHONSTARTUP file:

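For readers skimming the patch: the new guard boils down to the standalone check below (a minimal sketch, not part of the commit; note that the sys.version_info.major attribute used in the patch exists only from Python 2.7 on, whereas indexing the tuple also works on 2.6):

    import sys

    # Abort early when the running interpreter is not Python 2. Using the
    # print() function (rather than the Python 2 print statement) keeps this
    # file parseable under Python 3, so the user sees this error message
    # instead of a SyntaxError from the rest of shell.py.
    if sys.version_info[0] != 2:
        print("Error: Default Python used is Python%s" % sys.version_info[0])
        print("\tSet env variable PYSPARK_PYTHON to Python2 binary and re-run it.")
        sys.exit(1)

With the guard in place, a user whose system default is Python 3 can point PYSPARK_PYTHON at a Python 2 interpreter before launching the shell, e.g. PYSPARK_PYTHON=python2.7 ./bin/pyspark (the exact binary name depends on the system).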