Github user foxish commented on a diff in the pull request:
https://github.com/apache/spark/pull/21092#discussion_r182524173
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicDriverFeatureStep.scala ---
@@ -88,15 +94,22 @@ private[spark] class BasicDriverFeatureStep(
.addToRequests("memory", driverMemoryQuantity)
.addToLimits("memory", driverMemoryQuantity)
.endResources()
- .addToArgs("driver")
+ .addToArgs(driverDockerContainer)
.addToArgs("--properties-file", SPARK_CONF_PATH)
.addToArgs("--class", conf.roleSpecificConf.mainClass)
- // The user application jar is merged into the spark.jars list and
- // managed through that property, so there is no need to reference it explicitly here.
- .addToArgs(SparkLauncher.NO_RESOURCE)
- .addToArgs(conf.roleSpecificConf.appArgs: _*)
- .build()
+ val driverContainer =
+ if (driverDockerContainer == "driver-py") {
--- End diff ---
Wondering if we can discover if it's a Python application in a better way here. Probably using the built-up Spark conf?
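One way to act on this suggestion is to dispatch on the type of the main application resource carried in the driver conf, rather than string-matching a container name. The sketch below is a hypothetical illustration: the `MainAppResource`/`PythonMainAppResource` hierarchy and the `driverContainerName` helper are assumed shapes for this example, not the verified Spark K8s API.

```scala
// Hypothetical sketch: choose the driver container based on the kind of
// main application resource, instead of comparing the docker container
// name to the "driver-py" string literal.
sealed trait MainAppResource
case class JavaMainAppResource(primaryResource: String) extends MainAppResource
case class PythonMainAppResource(primaryResource: String) extends MainAppResource

object DriverContainerSelector {
  // Pattern matching makes the Python/JVM distinction explicit and
  // exhaustive, so a new resource type would be caught at compile time.
  def driverContainerName(resource: MainAppResource): String =
    resource match {
      case _: PythonMainAppResource => "driver-py"
      case _: JavaMainAppResource   => "driver"
    }
}
```

With this approach, the feature step would no longer need to thread a raw container-name string through its logic; the conf object itself encodes whether the application is a Python one.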
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]