Github user mccheah commented on a diff in the pull request:
https://github.com/apache/spark/pull/22146#discussion_r212448189
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala ---
@@ -225,6 +225,18 @@ private[spark] object Config extends Logging {
"Ensure that major Python version is either Python2 or Python3")
.createWithDefault("2")
+ val KUBERNETES_DRIVER_CONTAINER_NAME =
+ ConfigBuilder("spark.kubernetes.driver.containerName")
--- End diff ---
This feature should support pods that run multiple containers, in which case the
user needs a way to specify which container runs the Spark process. A
configuration option seems like the most straightforward way to do that.
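To illustrate, the truncated config entry above could be completed along these lines. This is only a sketch of what the reviewer is describing, not the actual code from the PR; the doc string and the choice of `createOptional` are assumptions, though `doc`, `stringConf`, and `createOptional` are standard methods on Spark's internal `ConfigBuilder`.

```scala
// Hypothetical completion of the entry above -- the real PR may use a
// different doc string or default behavior.
val KUBERNETES_DRIVER_CONTAINER_NAME =
  ConfigBuilder("spark.kubernetes.driver.containerName")
    .doc("Name of the container within the driver pod that runs the Spark " +
      "process, for pods that carry additional containers such as sidecars.")
    .stringConf
    .createOptional  // unset means the pod's single/first container is used
```

Making the option optional (rather than defaulting to a fixed name) would preserve the existing single-container behavior when the user does not set it.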
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]