Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19717#discussion_r153407637
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala ---
@@ -0,0 +1,160 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s
+
+import java.util.concurrent.TimeUnit
+
+import org.apache.spark.{SPARK_VERSION => sparkVersion}
+import org.apache.spark.internal.Logging
+import org.apache.spark.internal.config.ConfigBuilder
+import org.apache.spark.network.util.ByteUnit
+
+private[spark] object Config extends Logging {
+
+  val KUBERNETES_NAMESPACE =
+    ConfigBuilder("spark.kubernetes.namespace")
+      .doc("The namespace that will be used for running the driver and executor pods. When using " +
+        "spark-submit in cluster mode, this can also be passed to spark-submit via the " +
+        "--kubernetes-namespace command line argument.")
+      .stringConf
+      .createWithDefault("default")
+
+  val DRIVER_DOCKER_IMAGE =
+    ConfigBuilder("spark.kubernetes.driver.docker.image")
+      .doc("Docker image to use for the driver. Specify this using the standard Docker tag format.")
+      .stringConf
+      .createWithDefault(s"spark-driver:$sparkVersion")
+
+  val EXECUTOR_DOCKER_IMAGE =
+    ConfigBuilder("spark.kubernetes.executor.docker.image")
+      .doc("Docker image to use for the executors. Specify this using the standard Docker tag " +
+        "format.")
+      .stringConf
+      .createWithDefault(s"spark-executor:$sparkVersion")
+
+  val DOCKER_IMAGE_PULL_POLICY =
+    ConfigBuilder("spark.kubernetes.docker.image.pullPolicy")
--- End diff --
This configuration takes one of a fixed set of String options, so we should validate
the value against the legal options, like this SQL conf does:
```scala
  val CATALOG_IMPLEMENTATION =
    buildStaticConf("spark.sql.catalogImplementation")
      .internal()
      .stringConf
      .checkValues(Set("hive", "in-memory"))
      .createWithDefault("in-memory")
```
Besides, we should add `checkValue` for the other `ConfigEntry`s where possible.
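For reference, a minimal sketch of what `checkValues` could look like on this entry. This is illustrative, not the code in this PR; it assumes the three pull policies Kubernetes itself accepts (`Always`, `Never`, `IfNotPresent`) are the intended legal values, and the default shown is just an example:
```scala
// Hypothetical sketch: restrict the pull policy to the three values
// Kubernetes accepts, and reject anything else at config-parsing time.
val DOCKER_IMAGE_PULL_POLICY =
  ConfigBuilder("spark.kubernetes.docker.image.pullPolicy")
    .doc("Kubernetes image pull policy. Valid values are Always, Never, and IfNotPresent.")
    .stringConf
    .checkValues(Set("Always", "Never", "IfNotPresent"))
    .createWithDefault("IfNotPresent")
```
`checkValue` takes a predicate plus an error message, so it fits entries whose legal values aren't a fixed enumeration, e.g. `.checkValue(_ > 0, "Value must be positive")` on a numeric conf.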
---