attilapiros commented on a change in pull request #35345:
URL: https://github.com/apache/spark/pull/35345#discussion_r795627392
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
##########
@@ -341,7 +341,9 @@ private[spark] object Config extends Logging {
ConfigBuilder("spark.kubernetes.driver.pod.featureSteps")
.doc("Class names of an extra driver pod feature step implementing " +
"KubernetesFeatureConfigStep. This is a developer API. Comma separated. " +
- "Runs after all of Spark internal feature steps.")
+ "Runs after all of Spark internal feature steps. After 3.3.0, you can extend your " +
Review comment:
Please update `docs/running-on-kubernetes.md` too
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -19,7 +19,61 @@ package org.apache.spark.deploy.k8s.features
import io.fabric8.kubernetes.api.model.HasMetadata
import org.apache.spark.annotation.{DeveloperApi, Unstable}
-import org.apache.spark.deploy.k8s.SparkPod
+import org.apache.spark.deploy.k8s.{KubernetesConf, KubernetesDriverConf, KubernetesExecutorConf, SparkPod}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver and executor side.
+ * Note: If your custom feature step would be used in both driver and executor, please use this.
+ */
+@Unstable
+@DeveloperApi
+trait KubernetesCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
+
+ protected var kubernetesConf: KubernetesConf = _
+
+ override def init(conf: KubernetesConf): KubernetesFeatureConfigStep = {
Review comment:
I would not return `this`, as it makes the client wonder about the reason
behind it (which is just saving one line).
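
For illustration, a minimal sketch of the `Unit`-returning shape this comment suggests (trait and method names follow the diff above; the call site is hypothetical):

```scala
// Suggested shape: init returns Unit, so the caller initializes the step
// in its own statement instead of chaining on the return value.
trait KubernetesCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
  def init(conf: KubernetesConf): Unit
}

// Hypothetical call site in a builder:
// step.init(conf)            // one extra line, but no surprising return value
// features = features :+ step
```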
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -19,7 +19,58 @@ package org.apache.spark.deploy.k8s.features
import io.fabric8.kubernetes.api.model.HasMetadata
import org.apache.spark.annotation.{DeveloperApi, Unstable}
-import org.apache.spark.deploy.k8s.SparkPod
+import org.apache.spark.deploy.k8s.{KubernetesConf, KubernetesDriverConf, KubernetesExecutorConf, SparkPod}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver and executor side.
+ * Note: If your custom feature step would be used in both driver and executor, please use this.
+ */
+@Unstable
+@DeveloperApi
+trait KubernetesCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
+
+ protected var kubernetesConf: KubernetesConf = _
+
+ override def init(conf: KubernetesConf): Unit = {
+ kubernetesConf = conf
+ }
+}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver side.
+ * Note: If your custom feature step would be used in only driver, please use this.
+ */
+@Unstable
+@DeveloperApi
+trait KubernetesDriverCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
+
+ protected var driverConf: KubernetesDriverConf = _
Review comment:
ditto
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -19,7 +19,58 @@ package org.apache.spark.deploy.k8s.features
import io.fabric8.kubernetes.api.model.HasMetadata
import org.apache.spark.annotation.{DeveloperApi, Unstable}
-import org.apache.spark.deploy.k8s.SparkPod
+import org.apache.spark.deploy.k8s.{KubernetesConf, KubernetesDriverConf, KubernetesExecutorConf, SparkPod}
+
+/**
+ * :: DeveloperApi ::
+ *
+ * A base class to help user extend custom feature step in driver and executor side.
+ * Note: If your custom feature step would be used in both driver and executor, please use this.
+ */
+@Unstable
+@DeveloperApi
+trait KubernetesCustomFeatureConfigStep extends KubernetesFeatureConfigStep {
+
+ protected var kubernetesConf: KubernetesConf = _
Review comment:
I do not think this is needed. Let the implementation decide how and
what is stored once the `KubernetesConf` is received in `init`: maybe just a
single value is needed, or an object computed from a config setting...
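
A hedged sketch of what this comment is asking for: the user implementation keeps only what it derives from the conf in `init`, instead of the trait storing the whole `KubernetesConf` in a protected var (the step class and config key below are illustrative, not from the PR):

```scala
// Illustrative user step: stores only a derived value, not the conf itself.
class MyAnnotationFeatureStep extends KubernetesCustomFeatureConfigStep {
  private var annotationValue: String = _

  override def init(conf: KubernetesConf): Unit = {
    // keep just the computed value that configurePod needs later
    annotationValue = conf.sparkConf.get("spark.myapp.podAnnotation", "none")
  }

  override def configurePod(pod: SparkPod): SparkPod = {
    // ... use annotationValue to decorate the pod ...
    pod
  }
}
```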
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala
##########
@@ -31,6 +85,13 @@ import org.apache.spark.deploy.k8s.SparkPod
@DeveloperApi
trait KubernetesFeatureConfigStep {
+ /**
+ * Initialize the configuration for user feature step; this is only applicable when the user
+ * specified `spark.kubernetes.driver.pod.featureSteps` or `spark.kubernetes.executor.pod.featureSteps`.
+ * The init would be called after feature step loading.
+ */
+ def init(config: KubernetesConf): KubernetesFeatureConfigStep = this
Review comment:
I would remove this.
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesUtils.scala
##########
@@ -381,4 +382,13 @@ object KubernetesUtils extends Logging {
}
}
}
+
+ // Load and init custom feature step according to `className` and `conf`
+ // This method is used by `KubernetesDriverBuilder` and `KubernetesExecutorBuilder`
+ @Since("3.3.0")
+ def loadFeatureStep(conf: KubernetesConf, className: String): KubernetesFeatureConfigStep = {
Review comment:
I would remove this `def` and use something like this instead, for
example in `KubernetesExecutorBuilder`:
```scala
val feature = Utils.classForName[Any](className).newInstance()
val initializedFeature = feature match {
  case e: KubernetesExecutorCustomFeatureConfigStep =>
    e.init(conf)
    Some(e)
  case _: KubernetesDriverCustomFeatureConfigStep =>
    None
  case b: KubernetesCustomFeatureConfigStep =>
    b.init(conf)
    Some(b)
  case _ =>
    None
}
initializedFeature.getOrElse {
  logError("...")
  throw new SparkException("...")
}
```
This way `conf` has the correct type (`KubernetesExecutorConf`) and we can
check for the correct accepted types. Regarding
`KubernetesDriverCustomFeatureConfigStep`, I would not introduce the `init` at all.
If you agree, please run some manual tests with wrong types, both for the
executor and for the driver.
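
One hedged way to exercise the wrong-type path asked about above, assuming a ScalaTest-style suite and that the builder fails fast when an executor-only step is configured on the driver side (the `driverConf`/`kubernetesClient` fixtures and the message fragment are illustrative):

```scala
// Illustrative negative test: an executor-only custom step configured via
// spark.kubernetes.driver.pod.featureSteps should be rejected on the driver.
val e = intercept[SparkException] {
  new KubernetesDriverBuilder().buildFromFeatures(driverConf, kubernetesClient)
}
assert(e.getMessage.contains("feature step"))
```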
##########
File path:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/Config.scala
##########
@@ -351,7 +353,9 @@ private[spark] object Config extends Logging {
ConfigBuilder("spark.kubernetes.executor.pod.featureSteps")
.doc("Class name of an extra executor pod feature step implementing " +
"KubernetesFeatureConfigStep. This is a developer API. Comma separated. " +
- "Runs after all of Spark internal feature steps.")
+ "Runs after all of Spark internal feature steps. After 3.3.0, you can extend your " +
Review comment:
likewise
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]