Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19717#discussion_r154751082
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
---
@@ -119,5 +139,60 @@ private[spark] object Config extends Logging {
"must be a positive integer")
.createWithDefault(10)
+
+  val WAIT_FOR_APP_COMPLETION =
+    ConfigBuilder("spark.kubernetes.submission.waitAppCompletion")
+      .doc("In cluster mode, whether to wait for the application to finish before exiting the " +
+        "launcher process.")
+      .booleanConf
+      .createWithDefault(true)
+
+  val REPORT_INTERVAL =
+    ConfigBuilder("spark.kubernetes.report.interval")
+      .doc("Interval between reports of the current app status in cluster mode.")
+      .timeConf(TimeUnit.MILLISECONDS)
+      .createWithDefaultString("1s")
+
+  private[spark] val JARS_DOWNLOAD_LOCATION =
+    ConfigBuilder("spark.kubernetes.mountDependencies.jarsDownloadDir")
+      .doc("Location to download jars to in the driver and executors. When using" +
+        " spark-submit, this directory must be empty and will be mounted as an empty directory" +
+        " volume on the driver and executor pod.")
+      .stringConf
+      .createWithDefault("/var/spark-data/spark-jars")
--- End diff --
The doc string says "download jars to". Is it guaranteed that this
directory will be writable? Generally only root can write to things under "/var"
by default, and I assume you're not running things as root even inside
a container.
---
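The writability concern above can be illustrated with a small sketch. This is a hypothetical pre-flight check, not code from the PR: `DownloadDirCheck` and `isUsableDownloadDir` are made-up names, and `/tmp` stands in for the actual default `/var/spark-data/spark-jars`, which a non-root container user typically could not write to unless the pod spec mounts an emptyDir volume at that path.

```scala
import java.nio.file.{Files, Paths}

// Hypothetical helper (not part of the PR): verify that the directory
// configured via spark.kubernetes.mountDependencies.jarsDownloadDir
// exists and is writable by the current (likely non-root) user before
// attempting to download jars into it.
object DownloadDirCheck {
  def isUsableDownloadDir(path: String): Boolean = {
    val dir = Paths.get(path)
    Files.isDirectory(dir) && Files.isWritable(dir)
  }

  def main(args: Array[String]): Unit = {
    // "/tmp" is used here only so the sketch runs anywhere; the real
    // default under "/var" would usually fail this check for non-root.
    println(isUsableDownloadDir("/tmp"))
  }
}
```

An emptyDir volume mounted at the configured path sidesteps the problem, since Kubernetes creates it writable for the pod regardless of what "/var" permissions look like in the base image.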