Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22682#discussion_r223966211
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -1890,6 +1890,10 @@ working with timestamps in `pandas_udf`s to get the 
best performance, see
     
     # Migration Guide
     
    +## Upgrading From Spark SQL 2.4 to 3.0
    +
    +  - In PySpark, when creating a `SparkSession` with 
`SparkSession.builder.getOrCreate()`, if there is an existing `SparkContext`, 
the builder used to update the `SparkConf` of the existing `SparkContext` with 
configurations specified to the builder. However, the `SparkContext` is shared 
by all `SparkSession`s, so it should not be mutated. Since 3.0, the builder no 
longer updates these configurations. If you want to change them, update them 
before creating the `SparkSession`.
    --- End diff --
    
    let's also mention that this is already the behavior of the Spark 
Java/Scala APIs.
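    
    For context, a minimal PySpark sketch of the behavior the note describes 
    (the `local[2]` master and the `spark.executor.memory` key are just 
    illustrative choices, not part of this diff):
    
    ```python
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SparkSession
    
    # An existing SparkContext, created before any SparkSession.
    sc = SparkContext(conf=SparkConf().setMaster("local[2]").setAppName("demo"))
    
    # Since 3.0, a config passed to the builder is no longer propagated to the
    # SparkConf of the shared, pre-existing SparkContext.
    spark = (SparkSession.builder
             .config("spark.executor.memory", "2g")
             .getOrCreate())
    
    # sc.getConf() returns a copy of the context's conf; in 3.0 it does not
    # pick up the builder's setting. To change such configurations, set them
    # on the SparkConf before the SparkContext is created.
    print(sc.getConf().get("spark.executor.memory", "not set"))
    ```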


---
