AngersZhuuuu commented on a change in pull request #31598:
URL: https://github.com/apache/spark/pull/31598#discussion_r582506824
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
##########
@@ -897,6 +898,24 @@ object SparkSession extends Logging {
this
}
+ // These configurations related to driver when deploy like `spark.master`,
+ // `spark.driver.memory`, this kind of properties may not be affected when
+ // setting programmatically through SparkConf in runtime, or the behavior is
+ // depending on which cluster manager and deploy mode you choose, so it would
+ // be suggested to set through configuration file or spark-submit command line options.
Review comment:
> I think you can just add some docs for each configuration like from `blah blah` to `(Launcher scope) blah blah` or something like this.

Updated, how about the current version?
The warning message looks like:
```
11:38:42.540 WARN org.apache.spark.sql.SparkSession$Builder: Since spark has been submitted, such configurations `spark.driver.memory -> 1g` may not take effect.
For how to set these configuration correctly, you can refer to https://spark.apache.org/docs/latest/configuration.html#dynamically-loading-spark-properties.
```
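For context, a minimal sketch (not taken from the PR itself) of the pattern that would trigger this warning: setting a launcher-scope property such as `spark.driver.memory` programmatically after the application has already been submitted, at which point the driver JVM is already running. The app name and values below are hypothetical.
```scala
import org.apache.spark.sql.SparkSession

// Hypothetical example: the application was launched with spark-submit, so the
// driver JVM already exists. Overriding `spark.driver.memory` here is too late
// and would only produce the warning shown above.
val spark = SparkSession.builder()
  .appName("runtime-config-example")       // hypothetical app name
  .config("spark.driver.memory", "1g")     // launcher-scope property, set too late
  .getOrCreate()
```
Launcher-scope properties like this should instead be supplied at submission time, e.g. `spark-submit --driver-memory 1g ...`, or via `conf/spark-defaults.conf`, as described in the linked documentation.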
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]