AngersZhuuuu commented on a change in pull request #31598:
URL: https://github.com/apache/spark/pull/31598#discussion_r581689593



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
##########
@@ -897,6 +898,24 @@ object SparkSession extends Logging {
       this
     }
 
+    // These configurations relate to the driver at deploy time (for example
+    // `spark.master` and `spark.driver.memory`). Such properties may not take
+    // effect when set programmatically through SparkConf at runtime, or their
+    // behavior depends on which cluster manager and deploy mode you choose,
+    // so it is suggested to set them through a configuration file or
+    // spark-submit command line options.
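
A short sketch of the behavior the comment describes (assuming a Spark dependency on the classpath; the `local[*]` master and `8g` value are illustrative only):

```scala
// Sketch only: requires Spark on the classpath to run.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  // Too late to take effect: the driver JVM is already running with its
  // original heap size, so this value is recorded but not applied.
  .config("spark.driver.memory", "8g")
  .getOrCreate()

// The conf reads the value back, but the actual driver heap is unchanged.
println(spark.conf.get("spark.driver.memory"))

spark.stop()
```

The reliable way to size the driver is at deploy time, e.g. `spark-submit --driver-memory 8g ...`, which is exactly what the proposed comment suggests.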

Review comment:
    > @AngersZhuuuu, I would first document this explicitly, and what happens
for each configuration, before taking an action to show a warning. Also,
shouldn't we do this in `SparkContext` too?

   How about we add a new section in
https://spark.apache.org/docs/latest/configuration.html to collect and show
each configuration's usage scope?
   Also, we could add a `scope` tag in `ConfigBuilder` for these special
configurations.

   Then in this PR, we can just check the config type and log a warning message.
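
The `scope` tag idea could look roughly like this self-contained sketch (`ConfigScope`, `withScope`, and `warnIfDeployOnly` are hypothetical names for illustration, not Spark's actual internal `ConfigBuilder` API):

```scala
// Hypothetical sketch of a `scope` tag; not the real Spark ConfigBuilder.
object ConfigScope extends Enumeration {
  val DeployOnly, Runtime = Value
}

// Simplified stand-in for an internal config entry.
case class ConfigEntry(key: String, scope: ConfigScope.Value)

class ConfigBuilder(key: String) {
  private var scope: ConfigScope.Value = ConfigScope.Runtime
  def withScope(s: ConfigScope.Value): ConfigBuilder = { scope = s; this }
  def build(): ConfigEntry = ConfigEntry(key, scope)
}

val driverMemory = new ConfigBuilder("spark.driver.memory")
  .withScope(ConfigScope.DeployOnly)
  .build()

// SparkSession.Builder.config() could then check the tag and warn:
def warnIfDeployOnly(entry: ConfigEntry): Unit = {
  if (entry.scope == ConfigScope.DeployOnly) {
    println(s"${entry.key} should be set via spark-submit or a " +
      "configuration file; setting it at runtime may have no effect.")
  }
}

warnIfDeployOnly(driverMemory)
```

With a tag like this, the check in this PR reduces to a lookup on the entry's scope rather than a hard-coded list of config keys.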




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
