GitHub user rdblue opened a pull request:
https://github.com/apache/spark/pull/16519
SPARK-19138: Don't return SparkSession for stopped SparkContext.
## What changes were proposed in this pull request?
* Update SparkSession to always return a session whose SparkContext is the current, active one (see the sketch below)
* Add SparkContext#isStopped so callers can check whether a context has been stopped
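
For context, here is a self-contained sketch of the behavior these changes aim for. The classes and the `get_or_create_session` helper are hypothetical stand-ins for illustration only, not Spark's actual internals:

```python
# Illustrative sketch only; all names here are hypothetical stand-ins.

class FakeContext:
    """Stand-in for SparkContext, exposing the new isStopped() check."""
    def __init__(self):
        self._stopped = False
    def stop(self):
        self._stopped = True
    def isStopped(self):
        return self._stopped

class FakeSession:
    """Stand-in for SparkSession, bound to one context."""
    def __init__(self, sc):
        self.sparkContext = sc

_cached_session = None

def get_or_create_session(sc):
    """Reuse the cached session only if it is bound to the live context."""
    global _cached_session
    if (_cached_session is None
            or _cached_session.sparkContext is not sc
            or _cached_session.sparkContext.isStopped()):
        _cached_session = FakeSession(sc)
    return _cached_session

# A session tied to a stopped context is replaced, not reused:
old = FakeContext()
s1 = get_or_create_session(old)
old.stop()
new = FakeContext()
s2 = get_or_create_session(new)
assert s2.sparkContext is new and s2 is not s1
```

The key point is the `isStopped()` guard: a cached session whose context has been stopped is discarded rather than returned.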
## How was this patch tested?
Manually tested that `sqlContext.sql` works when the SQLContext is created after stopping one SparkContext and starting another:
```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

sc.stop()
sc = SparkContext(conf=SparkConf())
sqlContext = HiveContext(sc)
sqlContext.sql("SHOW TABLES")
```
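(This assumes a PySpark shell, where `sc` is the pre-created context. Before the patch, the session cached for the first context could be handed back here, leaving the new HiveContext bound to the stopped SparkContext; with the patch, the session is rebuilt against the live context.)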
You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/rdblue/spark SPARK-19138-pyspark-stale-spark-context
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/16519.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #16519
----
commit 6d56df1e0d8c95e04eea41cd9776c72b34f9998b
Author: Ryan Blue <[email protected]>
Date: 2017-01-09T21:28:32Z
SPARK-19138: Don't return SparkSession for stopped SparkContext.
----