panbingkun commented on code in PR #36273:
URL: https://github.com/apache/spark/pull/36273#discussion_r853849065
##########
core/src/main/scala/org/apache/spark/SparkConf.scala:
##########
@@ -529,6 +529,17 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable
with Logging with Seria
s"(was '$javaOpts'). Use spark.executor.memory instead."
throw new Exception(msg)
}
+ if (javaOpts.contains("-Xms") && contains(EXECUTOR_MEMORY)) {
Review Comment:
> `EXECUTOR_MEMORY` has a default value, so I don't think `contains(EXECUTOR_MEMORY)` is a necessary condition.

In the SparkContext initialization process, the code is as follows:
[ _conf = config.clone()
_conf.validateSettings()
...
...](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L390-L393)
[ _executorMemory = _conf.getOption(EXECUTOR_MEMORY.key)
.orElse(Option(System.getenv("SPARK_EXECUTOR_MEMORY")))
.orElse(Option(System.getenv("SPARK_MEM"))
.map(warnSparkMem))
.map(Utils.memoryStringToMb)
.getOrElse(1024)
....](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L527-L532)
As written, the patch will only throw an exception when spark.executor.extraJavaOptions and
spark.executor.memory are both set manually.
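The fallback chain quoted above can be sketched as follows. This is a minimal, self-contained illustration, not the actual Spark code: `memoryStringToMb` is a simplified stand-in for `Utils.memoryStringToMb`, and the object and method names are hypothetical. The point it demonstrates is that a memory value is always resolved (falling back to the 1024 MB default), so gating the `-Xms` check on `contains(EXECUTOR_MEMORY)` is unnecessary:

```scala
// Hypothetical sketch of SparkContext's executor-memory resolution.
object MemoryFallbackSketch {

  // Simplified stand-in for Utils.memoryStringToMb: handles "2g", "512m",
  // or a plain byte count. The real helper supports more suffixes.
  def memoryStringToMb(s: String): Int = {
    val lower = s.toLowerCase
    if (lower.endsWith("g")) lower.dropRight(1).toInt * 1024
    else if (lower.endsWith("m")) lower.dropRight(1).toInt
    else (lower.toLong / (1024L * 1024L)).toInt
  }

  // Mirrors the fallback order quoted above: the conf setting, then the
  // SPARK_EXECUTOR_MEMORY env var, then SPARK_MEM, then the 1024 MB default.
  def resolveExecutorMemoryMb(
      confValue: Option[String],
      envExecutorMemory: Option[String],
      envSparkMem: Option[String]): Int = {
    confValue
      .orElse(envExecutorMemory)
      .orElse(envSparkMem)
      .map(memoryStringToMb)
      .getOrElse(1024)
  }

  def main(args: Array[String]): Unit = {
    // Even with nothing set, a value (the default) is always in effect,
    // which is why the -Xms validation need not check contains(EXECUTOR_MEMORY).
    println(resolveExecutorMemoryMb(None, None, None))       // 1024
    println(resolveExecutorMemoryMb(Some("2g"), None, None)) // 2048
  }
}
```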
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]