GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164349606
--- Diff: python/pyspark/sql/session.py ---
@@ -760,6 +764,7 @@ def stop(self):
"""Stop the underlying :class:`SparkContext`.
"""
self._sc.stop()
+ self._jvm.SparkSession.clearDefaultSession()
--- End diff ---
Hmm... if we didn't set the default session ourselves at L231, perhaps we shouldn't clear it here?
Likewise, if we're just picking up the existing JVM one at L217, we shouldn't clear it either.
WDYT?
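The guard being suggested could be sketched roughly like this. This is a toy stand-in, not the actual PySpark code: `FakeJvmSparkSession`, `FakePySession`, and the `set_default` flag are made-up names used only to illustrate the idea of clearing the JVM default session on `stop()` only when this Python session was the one that set it.

```python
# Hypothetical sketch of the reviewer's suggestion (not real PySpark code):
# only clear the JVM-side default session if *we* installed it.

class FakeJvmSparkSession:
    """Stand-in for the JVM SparkSession companion object."""
    default_session = None

    @classmethod
    def setDefaultSession(cls, session):
        cls.default_session = session

    @classmethod
    def clearDefaultSession(cls):
        cls.default_session = None


class FakePySession:
    def __init__(self, jvm, set_default):
        self._jvm = jvm
        # Remember whether we set the JVM default (cf. L231) or merely
        # picked up an existing one (cf. L217).
        self._set_default = set_default
        if set_default:
            jvm.setDefaultSession(self)

    def stop(self):
        # Only clear a default session that this session installed.
        if self._set_default and self._jvm.default_session is self:
            self._jvm.clearDefaultSession()


# A session that set the default clears it on stop...
owner = FakePySession(FakeJvmSparkSession, set_default=True)
owner.stop()
assert FakeJvmSparkSession.default_session is None

# ...but a session that only picked up an existing default leaves it alone.
FakeJvmSparkSession.setDefaultSession("existing-jvm-session")
borrower = FakePySession(FakeJvmSparkSession, set_default=False)
borrower.stop()
assert FakeJvmSparkSession.default_session == "existing-jvm-session"
```

Under that scheme the unconditional `clearDefaultSession()` call in the diff would become conditional on how the session was created.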
---