Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/21669#discussion_r223519417
--- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/scheduler/cluster/k8s/KubernetesExecutorBuilder.scala ---
@@ -51,7 +67,25 @@ private[spark] class KubernetesExecutorBuilder(
Seq(provideVolumesStep(kubernetesConf))
} else Nil
- val allFeatures = baseFeatures ++ secretFeature ++ secretEnvFeature ++ volumesFeature
+ val maybeHadoopConfFeatureSteps = maybeHadoopConfigMap.map { _ =>
+ val maybeKerberosStep =
+ for {
--- End diff --
This seems like a weird way to say:
```
if (maybeDTSecretName.isDefined && maybeDTDataItem.isDefined) {
  provideKerberosConfStep(kubernetesConf)
} else {
  provideHadoopSparkUserStep(kubernetesConf)
}
```
As you may have noticed, I really dislike `for...yield`.
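For reference, here is a minimal standalone sketch of the two equivalent forms. The `Option` values, the `String` types, and the two `provide*Step` stubs are placeholders for illustration only; the real builder works with `KubernetesConf` and `KubernetesFeatureConfigStep` instances rather than strings.

```scala
object ForYieldVsIfElse {
  // Placeholder inputs; in the PR these come from the Kubernetes conf.
  val maybeDTSecretName: Option[String] = Some("dt-secret")
  val maybeDTDataItem: Option[String] = Some("dt-data-item")
  val kubernetesConf: String = "conf"

  // Stubs standing in for the feature-step providers.
  def provideKerberosConfStep(conf: String): String = s"KerberosConfStep($conf)"
  def provideHadoopSparkUserStep(conf: String): String = s"HadoopSparkUserStep($conf)"

  // for...yield form: yields the Kerberos step only when both Options are
  // defined, then falls back to the Spark-user step.
  val viaForYield: String =
    (for {
      _ <- maybeDTSecretName
      _ <- maybeDTDataItem
    } yield provideKerberosConfStep(kubernetesConf))
      .getOrElse(provideHadoopSparkUserStep(kubernetesConf))

  // The equivalent if/else suggested above.
  val viaIfElse: String =
    if (maybeDTSecretName.isDefined && maybeDTDataItem.isDefined) {
      provideKerberosConfStep(kubernetesConf)
    } else {
      provideHadoopSparkUserStep(kubernetesConf)
    }

  def main(args: Array[String]): Unit = {
    assert(viaForYield == viaIfElse)
    println(viaIfElse)
  }
}
```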
---