Github user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21092#discussion_r189981496
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala ---
@@ -154,6 +176,24 @@ private[spark] object Config extends Logging {
    .checkValue(interval => interval > 0,
      s"Logging interval must be a positive time value.")
    .createWithDefaultString("1s")
+  val MEMORY_OVERHEAD_FACTOR =
+    ConfigBuilder("spark.kubernetes.memoryOverheadFactor")
+      .doc("Sets the memory overhead factor that allocates additional memory to jobs; " +
+        "defaults to 0.10 for JVM jobs and 0.40 for non-JVM jobs.")
+      .doubleConf
+      .checkValue(mem_overhead => mem_overhead >= 0 && mem_overhead < 1,
+        "Ensure that memory overhead is a double between 0.0 and 1.0")
+      .createOptional
+
+  val PYSPARK_PYTHON_VERSION =
--- End diff --
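For context, here is a rough sketch of how an overhead factor like the one above is typically applied when sizing a pod. The object, names, and floor value below are illustrative assumptions, not code from this PR:

object OverheadSketch {
  // Spark's usual floor for memory overhead, in MiB (assumption: the PR
  // keeps a similar minimum).
  val MinOverheadMiB = 384L

  // overhead = max(factor * executor memory, minimum floor)
  def overheadMiB(executorMemoryMiB: Long, factor: Double): Long =
    math.max((factor * executorMemoryMiB).toLong, MinOverheadMiB)

  def main(args: Array[String]): Unit = {
    println(overheadMiB(4096L, 0.10)) // JVM default factor -> 409
    println(overheadMiB(4096L, 0.40)) // proposed non-JVM default -> 1638
  }
}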
This is minor, but I have a few questions about this element of the config.
First off, if this is going to be major-version only, let's call it something
like majorPythonVersion (e.g., many Python 2 and Python 3 releases exist).
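For illustration, a major-version-only setting along those lines could look like the sketch below; the key name, doc text, and default value are hypothetical, not part of this PR:

  val PYSPARK_MAJOR_PYTHON_VERSION =
    ConfigBuilder("spark.kubernetes.pyspark.majorPythonVersion")
      .doc("The major Python version ('2' or '3') used to run PySpark jobs.")
      .stringConf
      .checkValue(pv => Set("2", "3").contains(pv),
        "Ensure the major Python version is either '2' or '3'.")
      .createWithDefault("2")

This mirrors the checkValue pattern used above while making the major-only semantics explicit in the key name.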
---