amaliujia commented on code in PR #48526:
URL: https://github.com/apache/spark/pull/48526#discussion_r1805242631


##########
sql/core/src/main/scala/org/apache/spark/sql/internal/RuntimeConfigImpl.scala:
##########
@@ -85,6 +85,15 @@ class RuntimeConfigImpl private[sql](val sqlConf: SQLConf = new SQLConf) extends
   }
 
   private[sql] def requireNonStaticConf(key: String): Unit = {
+    // SPARK-48773 documented `spark.default.parallelism`, but this config is
+    // actually a static config, so a spark.conf.set("spark.default.parallelism")
+    // call would now fail. Before SPARK-48773 it did not fail, so failing here
+    // would be a behavior change. Although failing is technically correct, we
+    // skip the check for this key so that setting default parallelism through
+    // the Spark session keeps working as before.
+    if (key == DEFAULT_PARALLELISM.key) {

Review Comment:
   Added. 
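   For context, the guard discussed above can be sketched in isolation. This is a minimal sketch, not Spark's actual implementation: `DefaultParallelismKey`, `staticConfKeys`, and the plain `IllegalArgumentException` are stand-ins for `DEFAULT_PARALLELISM.key`, Spark's static-config registry, and its error classes, none of which appear in the quoted snippet.

   ```scala
   // Hypothetical sketch of the behavior-preserving guard from the diff.
   object RuntimeConfigSketch {
     // Stand-in for org.apache.spark.internal.config.DEFAULT_PARALLELISM.key.
     val DefaultParallelismKey = "spark.default.parallelism"

     // Stand-in for Spark's registry of static (non-runtime-settable) configs;
     // after SPARK-48773, spark.default.parallelism is recognized as static.
     val staticConfKeys: Set[String] =
       Set("spark.sql.warehouse.dir", DefaultParallelismKey)

     def requireNonStaticConf(key: String): Unit = {
       // Carve-out: allow spark.default.parallelism even though it is static,
       // so conf.set("spark.default.parallelism", ...) keeps succeeding as it
       // did before SPARK-48773.
       if (key == DefaultParallelismKey) return
       if (staticConfKeys.contains(key)) {
         throw new IllegalArgumentException(s"Cannot modify static config: $key")
       }
     }

     def main(args: Array[String]): Unit = {
       requireNonStaticConf("spark.default.parallelism")     // allowed by the carve-out
       requireNonStaticConf("spark.sql.shuffle.partitions")  // not static: allowed
       println("ok")
     }
   }
   ```

   With the carve-out removed, the first call would throw, which is exactly the behavior change the comment is guarding against.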



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

