AngersZhuuuu commented on a change in pull request #31598:
URL: https://github.com/apache/spark/pull/31598#discussion_r584451204
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
##########
@@ -897,6 +898,18 @@ object SparkSession extends Logging {
this
}
+  // Configurations related to the driver at deploy time, like `spark.master` and
+  // `spark.driver.memory`, may not take effect when set programmatically through
+  // SparkConf at runtime, or their behavior may depend on which cluster manager
+  // and deploy mode you choose, so it is recommended to set them through a
+  // configuration file or spark-submit command line options.
+  private val DRIVER_RELATED_LAUNCHER_CONFIG = Seq(DRIVER_MEMORY, DRIVER_CORES.key,
+    DRIVER_MEMORY_OVERHEAD.key, DRIVER_EXTRA_CLASSPATH,
+    DRIVER_DEFAULT_JAVA_OPTIONS, DRIVER_EXTRA_JAVA_OPTIONS, DRIVER_EXTRA_LIBRARY_PATH,
+    "spark.driver.resource", PYSPARK_DRIVER_PYTHON, PYSPARK_PYTHON, SPARKR_R_SHELL,
+    CHILD_PROCESS_LOGGER_NAME, CHILD_CONNECTION_TIMEOUT, DRIVER_USER_CLASS_PATH_FIRST.key,
+    "spark.yarn.*")
Review comment:
> @AngersZhuuuu, are they all configurations to include? is it just a subset of them?

That's what I found on the configuration page. I'm not sure whether there are configurations defined in the code that I missed.
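To illustrate how such a list could be consumed, here is a minimal, hypothetical sketch (not the actual PR code) of matching a runtime-set config key against a launcher-only list, treating a trailing `.*` entry like `"spark.yarn.*"` as a prefix match. The key strings below are stand-ins for the real Spark config constants referenced in the diff:

```scala
// Hypothetical helper: decide whether a config key belongs to the set of
// launcher/driver configs that should not be set programmatically at runtime.
object LauncherConfigCheck {
  // Illustrative stand-ins for the constants in DRIVER_RELATED_LAUNCHER_CONFIG.
  val driverRelatedLauncherConfigs: Seq[String] = Seq(
    "spark.driver.memory",
    "spark.driver.cores",
    "spark.driver.extraClassPath",
    "spark.driver.resource",
    "spark.yarn.*")

  // Returns true if `key` matches an entry exactly, or matches a wildcard
  // entry's prefix (".*" at the end means "any key under this namespace").
  def isLauncherOnly(key: String): Boolean =
    driverRelatedLauncherConfigs.exists { entry =>
      if (entry.endsWith(".*")) key.startsWith(entry.dropRight(1))
      else key == entry
    }
}
```

With this sketch, `isLauncherOnly("spark.yarn.queue")` would be true via the wildcard, while `isLauncherOnly("spark.executor.memory")` would be false. Note that in real Spark, entries like `spark.driver.resource` also act as prefixes for keys such as `spark.driver.resource.{name}.amount`, which a production check would need to handle.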
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]