tgravescs commented on a change in pull request #30206:
URL: https://github.com/apache/spark/pull/30206#discussion_r515991974
##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
##########
@@ -211,6 +211,24 @@ private[spark] object Config extends Logging {
.stringConf
.createOptional
+ val KUBERNETES_DRIVER_POD_FEATURE_STEPS =
+ ConfigBuilder("spark.kubernetes.driver.pod.featureSteps")
+ .doc("Class names of an extra driver pod feature step implementing " +
+ "KubernetesFeatureConfigStep. This is a developer API. Comma
separated.")
Review comment:
We should document when these user-provided steps run relative to the existing built-in feature steps.
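
For illustration, a user-supplied step might look roughly like the sketch below (the package, class name, and label are hypothetical, and only configurePod is overridden); documenting the ordering matters because it decides, for example, whether such a step sees the metadata the built-in steps have already set:

    package com.example

    import io.fabric8.kubernetes.api.model.PodBuilder

    import org.apache.spark.deploy.k8s.SparkPod
    import org.apache.spark.deploy.k8s.features.KubernetesFeatureConfigStep

    // Hypothetical custom step that adds a label to the driver pod.
    class ExampleLabelFeatureStep extends KubernetesFeatureConfigStep {
      override def configurePod(pod: SparkPod): SparkPod = {
        val labeledPod = new PodBuilder(pod.pod)
          .editOrNewMetadata()
            .addToLabels("example.com/team", "data-platform")
          .endMetadata()
          .build()
        SparkPod(labeledPod, pod.container)
      }
    }

It would then be wired in with something like:

    --conf spark.kubernetes.driver.pod.featureSteps=com.example.ExampleLabelFeatureStep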
##########
File path: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -18,13 +18,17 @@ package org.apache.spark.deploy.k8s.features
import io.fabric8.kubernetes.api.model.HasMetadata
+import org.apache.spark.annotation.DeveloperApi
import org.apache.spark.deploy.k8s.SparkPod
/**
+ * :: DeveloperApi ::
+ *
* A collection of functions that together represent a "feature" in pods that are launched for
* Spark drivers and executors.
*/
-private[spark] trait KubernetesFeatureConfigStep {
+@DeveloperApi
+trait KubernetesFeatureConfigStep {
Review comment:
If we are making this a pluggable API, it would be better to make it a Java interface like ExecutorPlugin, DriverPlugin, ResourceDiscoveryPlugin, and LocalDiskShuffleDataIO.
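
For context, the surface a plugin author would implement is small; a rough Scala sketch of the existing hooks (which a Java interface with default methods, as in DriverPlugin, would mirror) looks like this:

    import io.fabric8.kubernetes.api.model.HasMetadata

    import org.apache.spark.deploy.k8s.SparkPod

    trait KubernetesFeatureConfigStep {
      // Transform the pod and its main container for this feature.
      def configurePod(pod: SparkPod): SparkPod

      // Extra Spark conf entries this feature needs (empty by default).
      def getAdditionalPodSystemProperties(): Map[String, String] = Map.empty

      // Extra Kubernetes resources (e.g. ConfigMaps, Secrets) to create (empty by default).
      def getAdditionalKubernetesResources(): Seq[HasMetadata] = Seq.empty
    }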