Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22295#discussion_r222699236
  
    --- Diff: python/pyspark/sql/session.py ---
    @@ -231,6 +231,7 @@ def __init__(self, sparkContext, jsparkSession=None):
                     or SparkSession._instantiatedSession._sc._jsc is None:
                 SparkSession._instantiatedSession = self
                 self._jvm.SparkSession.setDefaultSession(self._jsparkSession)
    +            self._jvm.SparkSession.setActiveSession(self._jsparkSession)
    --- End diff --
    
    So @HyukjinKwon, in this code session1 and session2 are already equal:
    
    > Welcome to
    >       ____              __
    >      / __/__  ___ _____/ /__
    >     _\ \/ _ \/ _ `/ __/  '_/
    >    /__ / .__/\_,_/_/ /_/\_\   version 2.3.1
    >       /_/
    > 
    > Using Python version 3.6.5 (default, Apr 29 2018 16:14:56)
    > SparkSession available as 'spark'.
    > >>> session1 = SparkSession.builder.config("key1", "value1").getOrCreate()
    > >>> session2 = SparkSession.builder.config("key2", "value2").getOrCreate()
    > >>> session1
    > <pyspark.sql.session.SparkSession object at 0x7ff6d4843b00>
    > >>> session2
    > <pyspark.sql.session.SparkSession object at 0x7ff6d4843b00>
    > >>> session1 == session2
    > True
    > >>> 
    
    That being said, having multiple Spark sessions in Python is doable; you
    just have to call the constructor (`__init__`) manually, e.g.:
    
    > >>> session3 = SparkSession(sc)
    > >>> session3
    > <pyspark.sql.session.SparkSession object at 0x7ff6d3dbd160>
    > >>> 
    
    And supporting that is reasonable.
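    
    For anyone following along, here is a minimal sketch of the same
    behaviour (assuming a plain local Spark install; `newSession()` is just
    another route to an independent session besides calling the constructor
    directly):
    
        from pyspark import SparkContext
        from pyspark.sql import SparkSession
    
        sc = SparkContext.getOrCreate()
    
        # getOrCreate() hands back the already-instantiated default session,
        # which is why session1 and session2 above are the same object.
        session1 = SparkSession.builder.config("key1", "value1").getOrCreate()
        session2 = SparkSession.builder.config("key2", "value2").getOrCreate()
        assert session1 is session2
    
        # A genuinely separate session, sharing the SparkContext but with its
        # own SQL configuration and temporary views, can be obtained with
        # newSession() or by calling the constructor as above.
        session3 = session1.newSession()
        session3.conf.set("spark.sql.shuffle.partitions", "8")
        assert session3 is not session1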


---
