[ https://issues.apache.org/jira/browse/SPARK-1458?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14075126#comment-14075126 ]

Apache Spark commented on SPARK-1458:
-------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/1596

> Expose sc.version in PySpark
> ----------------------------
>
>                 Key: SPARK-1458
>                 URL: https://issues.apache.org/jira/browse/SPARK-1458
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark, Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Nicholas Chammas
>            Assignee: Josh Rosen
>            Priority: Minor
>
> As discussed 
> [here|http://apache-spark-user-list.1001560.n3.nabble.com/programmatic-way-to-tell-Spark-version-td1929.html],
>  I think it would be nice if there were a way to programmatically determine 
> which version of Spark you are running. 
> The potential use cases are not that important, but they include:
> # Branching your code based on which version of Spark is running.
> # Checking your version without having to quit and restart the Spark shell.
> Right now in PySpark, I believe the only way to determine your version is 
> to fire up the Spark shell and look at the startup banner.
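
As a minimal sketch of what this could look like once exposed (assuming the
property lands as sc.version, per the issue title and the pull request above):

    from pyspark import SparkContext

    sc = SparkContext("local", "version-check")

    # sc.version would return the running Spark version as a string, e.g. '1.1.0'
    print(sc.version)

    # Use case 1 above: branch on the running Spark version
    if sc.version.startswith("0.9"):
        pass  # fall back to 0.9-era behavior here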



--
This message was sent by Atlassian JIRA
(v6.2#6252)
