[
https://issues.apache.org/jira/browse/SPARK-1458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Nicholas Chammas reopened SPARK-1458:
-------------------------------------
Reopening this ticket.
While the Scala shell exposes the Spark version via {{sc.version}}, there
does not appear to be an equivalent in PySpark.
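For illustration, here is a minimal sketch of the gap. The {{_jsc.version()}} call is an assumption about the Py4J-wrapped Java API ({{_jsc}} is a private attribute), not a documented workaround:
{code:python}
from pyspark import SparkContext

sc = SparkContext("local", "version-check")

# What this ticket asks for: a version attribute mirroring the Scala
# shell's sc.version. It does not appear to exist in PySpark today.
# print(sc.version)

# Hypothetical workaround: reach through the Py4J gateway to the wrapped
# JavaSparkContext, assuming it exposes a version() method.
print(sc._jsc.version())
{code}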
> Add programmatic way to determine Spark version
> -----------------------------------------------
>
> Key: SPARK-1458
> URL: https://issues.apache.org/jira/browse/SPARK-1458
> Project: Spark
> Issue Type: New Feature
> Components: PySpark, Spark Core
> Affects Versions: 0.9.0
> Reporter: Nicholas Chammas
> Priority: Minor
> Fix For: 1.0.0
>
>
> As discussed
> [here|http://apache-spark-user-list.1001560.n3.nabble.com/programmatic-way-to-tell-Spark-version-td1929.html],
> I think it would be nice if there were a way to programmatically determine
> what version of Spark you are running.
> The potential use cases are not that important, but they include:
> # Branching your code based on what version of Spark is running (see the sketch below).
> # Checking your version without having to quit and restart the Spark shell.
> Right now in PySpark, I believe the only way to determine your version is by
> firing up the Spark shell and looking at the startup banner.
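> As a sketch of use case 1, assuming PySpark exposed a {{version}} attribute on {{SparkContext}} the way the Scala shell exposes {{sc.version}}, the branching might look like this:
> {code:python}
> # Hypothetical: assumes sc.version exists in PySpark and returns a
> # string such as "1.0.0" or "1.0.0-SNAPSHOT".
> major, minor = [int(x) for x in sc.version.split(".")[:2]]
> if (major, minor) >= (1, 0):
>     print("Spark 1.0+: take the newer code path")
> else:
>     print("Pre-1.0 Spark: take the legacy code path")
> {code}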
--
This message was sent by Atlassian JIRA
(v6.2#6252)