Github user kokes commented on a diff in the pull request:
https://github.com/apache/spark/pull/21092#discussion_r193639554
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala ---
@@ -154,6 +176,24 @@ private[spark] object Config extends Logging {
.checkValue(interval => interval > 0, s"Logging interval must be a
positive time value.")
.createWithDefaultString("1s")
+ val MEMORY_OVERHEAD_FACTOR =
+ ConfigBuilder("spark.kubernetes.memoryOverheadFactor")
+ .doc("This sets the Memory Overhead Factor that will allocate memory
to non-JVM jobs " +
+ "which in the case of JVM tasks will default to 0.10 and 0.40 for
non-JVM jobs")
+ .doubleConf
+ .checkValue(mem_overhead => mem_overhead >= 0 && mem_overhead < 1,
+ "Ensure that memory overhead is a double between 0 --> 1.0")
+ .createWithDefault(0.1)
+
+ val PYSPARK_MAJOR_PYTHON_VERSION =
+ ConfigBuilder("spark.kubernetes.pyspark.pythonversion")
+ .doc("This sets the python version. Either 2 or 3. (Python2 or
Python3)")
+ .stringConf
+ .checkValue(pv => List("2", "3").contains(pv),
+ "Ensure that Python Version is either Python2 or Python3")
+ .createWithDefault("2")
--- End diff ---
Am I reading this right that the default is Python 2? Is there a reason for
that? Thanks!
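
For anyone following along, a minimal sketch of how an application would opt out of that default, assuming the keys land exactly as written in this diff (the values and the `SubmitExample` wrapper are illustrative, not part of the PR):

```scala
import org.apache.spark.SparkConf

object SubmitExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      // Raise the overhead for a non-JVM (PySpark) workload; must satisfy 0 <= f < 1.
      .set("spark.kubernetes.memoryOverheadFactor", "0.4")
      // Request Python 3 explicitly, since the default in this diff is "2".
      .set("spark.kubernetes.pyspark.pythonversion", "3")
    // Print the effective settings for inspection.
    println(conf.toDebugString)
  }
}
```

Equivalently, both keys can be passed as `--conf` flags to spark-submit.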
---