[ https://issues.apache.org/jira/browse/SPARK-11744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nicholas Chammas updated SPARK-11744:
-------------------------------------
    Description: 
{{bin/pyspark \-\-help}} offers a {{\-\-version}} option:

{code}
$ ./spark/bin/pyspark --help
Usage: ./bin/pyspark [options]

Options:
...
  --version,                  Print the version of current Spark
...
{code}

However, trying to get the version in this way doesn't yield the expected 
results.

Instead of printing the version and exiting, we get the version, a stack trace, 
and then get dropped into a broken PySpark shell.

{code}
$ ./spark/bin/pyspark --version
Python 2.7.10 (default, Aug 11 2015, 23:39:10) 
[GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/
                        
Type --help for more information.
Traceback (most recent call last):
  File "/home/ec2-user/spark/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(pyFiles=add_files)
  File "/home/ec2-user/spark/python/pyspark/context.py", line 110, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "/home/ec2-user/spark/python/pyspark/context.py", line 234, in 
_ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/home/ec2-user/spark/python/pyspark/java_gateway.py", line 94, in 
launch_gateway
    raise Exception("Java gateway process exited before sending the driver its 
port number")
Exception: Java gateway process exited before sending the driver its port number
>>> 
>>> sc
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
{code}
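
What seems to happen is that the JVM side consumes {{\-\-version}}, prints the banner, and exits, so {{launch_gateway()}} never receives a port number, yet the Python REPL is started anyway. A minimal sketch of one possible guard in {{bin/pyspark}}, delegating to {{spark-submit \-\-version}} (hypothetical placement; not an actual patch):

{code}
# Hypothetical guard for bin/pyspark (a sketch only, not the actual fix):
# if --version appears among the arguments, hand off to spark-submit,
# which prints the version banner and exits, instead of launching the
# Python REPL against a gateway that has already exited.
for arg in "$@"; do
  if [ "$arg" = "--version" ]; then
    exec "${SPARK_HOME}"/bin/spark-submit --version
  fi
done
{code}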

  was:
{{bin/pyspark \-\-help}} offers a {{\-\-version}} option:

{code}
$ ./spark/bin/pyspark --help
Usage: ./bin/pyspark [options]

Options:
...
  --version,                  Print the version of current Spark
...
{code}

However, trying to get the version in this way doesn't yield the expected 
results.

Instead of printing the version and exiting, we get the version, a stack trace, 
and then get dropped into a plain Python shell ({{sc}} is not defined).

{code}
$ ./spark/bin/pyspark --version
Python 2.7.10 (default, Aug 11 2015, 23:39:10) 
[GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/
                        
Type --help for more information.
Traceback (most recent call last):
  File "/home/ec2-user/spark/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(pyFiles=add_files)
  File "/home/ec2-user/spark/python/pyspark/context.py", line 110, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "/home/ec2-user/spark/python/pyspark/context.py", line 234, in 
_ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/home/ec2-user/spark/python/pyspark/java_gateway.py", line 94, in 
launch_gateway
    raise Exception("Java gateway process exited before sending the driver its 
port number")
Exception: Java gateway process exited before sending the driver its port number
>>> 
>>> sc
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
{code}


> bin/pyspark --version doesn't return version and exit
> -----------------------------------------------------
>
>                 Key: SPARK-11744
>                 URL: https://issues.apache.org/jira/browse/SPARK-11744
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.5.2
>            Reporter: Nicholas Chammas
>            Priority: Minor
>
> {{bin/pyspark \-\-help}} offers a {{\-\-version}} option:
> {code}
> $ ./spark/bin/pyspark --help
> Usage: ./bin/pyspark [options]
> Options:
> ...
>   --version,                  Print the version of current Spark
> ...
> {code}
> However, trying to get the version in this way doesn't yield the expected 
> results.
> Instead of printing the version and exiting, we get the version, a stack 
> trace, and then get dropped into a broken PySpark shell.
> {code}
> $ ./spark/bin/pyspark --version
> Python 2.7.10 (default, Aug 11 2015, 23:39:10) 
> [GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
>       /_/
>                         
> Type --help for more information.
> Traceback (most recent call last):
>   File "/home/ec2-user/spark/python/pyspark/shell.py", line 43, in <module>
>     sc = SparkContext(pyFiles=add_files)
>   File "/home/ec2-user/spark/python/pyspark/context.py", line 110, in __init__
>     SparkContext._ensure_initialized(self, gateway=gateway)
>   File "/home/ec2-user/spark/python/pyspark/context.py", line 234, in 
> _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway()
>   File "/home/ec2-user/spark/python/pyspark/java_gateway.py", line 94, in 
> launch_gateway
>     raise Exception("Java gateway process exited before sending the driver 
> its port number")
> Exception: Java gateway process exited before sending the driver its port 
> number
> >>> 
> >>> sc
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> NameError: name 'sc' is not defined
> {code}


