Github user kiszk commented on a diff in the pull request:
https://github.com/apache/spark/pull/22746#discussion_r226245945
--- Diff: docs/sql-migration-guide-upgrade.md ---
@@ -0,0 +1,520 @@
+---
+layout: global
+title: Spark SQL Upgrading Guide
+displayTitle: Spark SQL Upgrading Guide
+---
+
+* Table of contents
+{:toc}
+
+## Upgrading From Spark SQL 2.4 to 3.0
+
+ - In PySpark, when creating a `SparkSession` with `SparkSession.builder.getOrCreate()`, if there is an existing `SparkContext`, the builder was trying to update the `SparkConf` of the existing `SparkContext` with configurations specified to the builder, but the `SparkContext` is shared by all `SparkSession`s, so we should not update them. Since 3.0, the builder come to not update the configurations. This is the same behavior as Java/Scala API in 2.3 and above. If you want to update them, you need to update them prior to creating a `SparkSession`.
--- End diff ---
`the builder come` -> `the builder comes`?
cc @ueshin
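
For readers following along, here is a minimal PySpark sketch of the behavior the quoted paragraph describes, assuming the Spark 3.0 semantics; the config key and values are only illustrative:

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

# Set context-level configurations on the SparkConf *before* the
# SparkContext exists; per the quoted guide text, this is how to apply
# them once 3.0 stops propagating builder configs to an existing context.
conf = SparkConf().set("spark.executor.memory", "2g")  # illustrative key/value
sc = SparkContext(conf=conf)

# Since 3.0, .config(...) on the builder no longer updates the shared
# SparkContext created above; the existing value ("2g") stays in effect,
# matching the Java/Scala API behavior in 2.3 and above.
spark = SparkSession.builder \
    .config("spark.executor.memory", "4g") \
    .getOrCreate()
```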
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]