Github user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21092#discussion_r183153738
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala ---
@@ -154,6 +176,13 @@ private[spark] object Config extends Logging {
      .checkValue(interval => interval > 0, s"Logging interval must be a positive time value.")
      .createWithDefaultString("1s")
+  val MEMORY_OVERHEAD_FACTOR =
+    ConfigBuilder("spark.kubernetes.memoryOverheadFactor")
+      .doc("This sets the Memory Overhead Factor that will allocate memory to non-JVM jobs " +
+        "which in the case of JVM tasks will default to 0.10 and 0.40 for non-JVM jobs")
--- End diff --
+1 to this, thanks for adding this.
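
For reference, a finished entry would presumably follow the same ConfigBuilder pattern as the logging-interval setting shown above. The sketch below only illustrates that pattern; the .doubleConf, .checkValue and .createWithDefault(0.1) calls are assumptions about how the entry might be completed inside Config.scala, not the actual patch:

    val MEMORY_OVERHEAD_FACTOR =
      ConfigBuilder("spark.kubernetes.memoryOverheadFactor")
        .doc("Memory overhead factor used when sizing pod memory requests; " +
          "0.10 by default for JVM jobs, with 0.40 suggested for non-JVM jobs.")
        .doubleConf
        // Hypothetical validation: a factor of zero or less would request no overhead at all.
        .checkValue(factor => factor > 0, "Memory overhead factor must be positive.")
        .createWithDefault(0.1)

A non-JVM (for example PySpark) job could then raise the factor at submit time, e.g. by passing --conf spark.kubernetes.memoryOverheadFactor=0.4 to spark-submit.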
---