Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164361700
--- Diff: python/pyspark/sql/session.py ---
@@ -213,7 +213,12 @@ def __init__(self, sparkContext, jsparkSession=None):
         self._jsc = self._sc._jsc
         self._jvm = self._sc._jvm
         if jsparkSession is None:
-            jsparkSession = self._jvm.SparkSession(self._jsc.sc())
+            if self._jvm.SparkSession.getDefaultSession().isDefined() \
+                    and not self._jvm.SparkSession.getDefaultSession().get() \
+                        .sparkContext().isStopped():
--- End diff ---
I guess the change in 4ba3aa2af1b7bbc69575c14fffed18d5f1f90d53 is enough
to fix the previous test failure (`ERROR:
test_sparksession_with_stopped_sparkcontext (pyspark.sql.tests.SQLTests2)`),
so we can now revert the change at 0319fa5c0527f68f3a3862afbbfd1b708f1d307d
that moved `self._jvm.SparkSession.clearDefaultSession()` into
`SparkContext.stop()`.
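
For readers following along, the quoted diff cuts off at the new condition. A rough
sketch of the full branch inside `SparkSession.__init__` that this condition would
guard is below; the reuse and fallback bodies are assumptions inferred from the
truncated diff, not quoted from the PR:

```python
if jsparkSession is None:
    if self._jvm.SparkSession.getDefaultSession().isDefined() \
            and not self._jvm.SparkSession.getDefaultSession().get() \
                .sparkContext().isStopped():
        # Assumed: reuse the JVM-side default session while its
        # SparkContext is still running.
        jsparkSession = self._jvm.SparkSession.getDefaultSession().get()
    else:
        # Assumed: otherwise create a fresh JVM SparkSession, as the
        # removed line did unconditionally.
        jsparkSession = self._jvm.SparkSession(self._jsc.sc())
```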
---