Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21990#discussion_r209507286
  
    --- Diff: python/pyspark/sql/session.py ---
    @@ -218,7 +218,9 @@ def __init__(self, sparkContext, jsparkSession=None):
                             .sparkContext().isStopped():
                 jsparkSession = self._jvm.SparkSession.getDefaultSession().get()
                 else:
    -                jsparkSession = self._jvm.SparkSession(self._jsc.sc())
    +                jsparkSession = self._jvm.SparkSession.builder() \
    +                    .sparkContext(self._jsc.sc()) \
    +                    .getOrCreate()
    --- End diff ---
    
    @RussellSpitzer, mind checking the `getOrCreate` logic on the Scala side
    and deduplicating it here while we are at it? Some logic, for instance
    setting the default session, is duplicated here on the Python side and
    there on the Scala side.
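
    For instance, a rough sketch of what a deduplicated path could look like
    (the helper name `_get_or_create_jspark_session` is hypothetical, and
    whether the Python-side default-session handling can then be dropped
    still needs verifying):

        # Hypothetical helper for python/pyspark/sql/session.py;
        # self._jvm is the Py4J gateway and self._jsc the Java
        # SparkContext, as elsewhere in the file.
        def _get_or_create_jspark_session(self, jsparkSession):
            if jsparkSession is not None:
                return jsparkSession
            # Scala's Builder.getOrCreate already implements "reuse the
            # default session if its context is alive, otherwise create
            # and register a new one", so the getDefaultSession() checks
            # in __init__ could go away.
            return self._jvm.SparkSession.builder() \
                .sparkContext(self._jsc.sc()) \
                .getOrCreate()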
    
    It would be nicer if we had some tests as well. `spark.sql.extensions` is
    a static configuration, right? In that case, we could add a test; for
    example, please refer to https://github.com/apache/spark/pull/21007, where
    I added a test with a static configuration before.
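
    A rough sketch of what such a test could look like, in the unittest style
    of `python/pyspark/sql/tests.py` ("org.example.MyExtensions" is just a
    placeholder class name, and the exact assertion is illustrative):

        import unittest

        from pyspark import SparkConf, SparkContext
        from pyspark.sql import SparkSession

        class StaticConfInitTests(unittest.TestCase):
            def test_static_conf_kept_by_session_init(self):
                # Static SQL configs must be set before the JVM session
                # exists, so they have to survive the SparkSession(sc)
                # creation path touched by this diff.
                conf = SparkConf().set(
                    "spark.sql.extensions", "org.example.MyExtensions")
                sc = SparkContext(
                    master="local[1]", appName="static-conf-test", conf=conf)
                try:
                    spark = SparkSession(sc)
                    self.assertEqual(
                        spark.conf.get("spark.sql.extensions"),
                        "org.example.MyExtensions")
                finally:
                    sc.stop()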

