Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164109079
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +225,7 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession._instantiatedSession is None \
or SparkSession._instantiatedSession._sc._jsc is None:
SparkSession._instantiatedSession = self
+    self._jvm.org.apache.spark.sql.SparkSession.setDefaultSession(self._jsparkSession)
--- End diff ---
Actually, it seems not, because we don't call this code path. The stop-and-start
logic in PySpark is convoluted, in my humble opinion. Setting the default session
fixes an actual issue, and it seems we are okay with it, at least.
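
For context, a minimal sketch of the issue this addresses, assuming a local master
(the builder settings and the `spark` name are illustrative). Creating a session from
Python runs the `__init__` path in the diff; the JVM-side default session can then be
inspected through py4j:

```python
from pyspark.sql import SparkSession

# Create (or reuse) a Python SparkSession; this runs SparkSession.__init__
# shown in the diff above.
spark = SparkSession.builder.master("local[1]").getOrCreate()

# Look up the JVM-side default session via py4j. Before this change it could
# be left unset for a Python-created session; with the change, __init__
# registers self._jsparkSession as the JVM default.
jvm_default = spark._jvm.org.apache.spark.sql.SparkSession.getDefaultSession()
print(jvm_default.isDefined())
```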
---