[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-12 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16719388#comment-16719388
 ] 

ASF GitHub Bot commented on SPARK-25877:


asfgit closed pull request #23220: [SPARK-25877][k8s] Move all feature logic to 
feature classes.
URL: https://github.com/apache/spark/pull/23220
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopConfExecutorFeatureStep.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopConfExecutorFeatureStep.scala
index bca66759d586e..da332881ae1a2 100644
--- a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopConfExecutorFeatureStep.scala
+++ b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopConfExecutorFeatureStep.scala
@@ -31,10 +31,10 @@ private[spark] class HadoopConfExecutorFeatureStep(conf: KubernetesExecutorConf)
 
   override def configurePod(pod: SparkPod): SparkPod = {
     val hadoopConfDirCMapName = conf.getOption(HADOOP_CONFIG_MAP_NAME)
-    require(hadoopConfDirCMapName.isDefined,
-      "Ensure that the env `HADOOP_CONF_DIR` is defined either in the client or " +
-        " using pre-existing ConfigMaps")
-    logInfo("HADOOP_CONF_DIR defined")
-    HadoopBootstrapUtil.bootstrapHadoopConfDir(None, None, hadoopConfDirCMapName, pod)
+    if (hadoopConfDirCMapName.isDefined) {
+      HadoopBootstrapUtil.bootstrapHadoopConfDir(None, None, hadoopConfDirCMapName, pod)
+    } else {
+      pod
+    }
   }
 }
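The hunk above captures the theme of the whole PR: the applicability check moves into the step itself, so a missing ConfigMap name makes the step a no-op instead of a hard `require` failure. A minimal standalone sketch of this pattern (the class and config names below are simplified stand-ins, not Spark's real k8s types):

```scala
// Illustrative sketch of a self-deciding feature step: the step checks
// whether its configuration is present and leaves the pod untouched if not.
// All names here are hypothetical stand-ins for the real Spark classes.
case class SparkPod(volumes: List[String])

class HadoopConfStep(conf: Map[String, String]) {
  def configurePod(pod: SparkPod): SparkPod =
    conf.get("hadoopConfigMapName") match {
      case Some(name) => pod.copy(volumes = name :: pod.volumes) // "bootstrap"
      case None       => pod // feature not configured: no-op
    }
}

val pod = SparkPod(Nil)
// Without the config entry the step returns the pod unchanged.
assert(new HadoopConfStep(Map.empty).configurePod(pod) == pod)
// With the config entry the step applies its feature.
assert(new HadoopConfStep(Map("hadoopConfigMapName" -> "hconf"))
  .configurePod(pod).volumes == List("hconf"))
```

The builder can then apply every step unconditionally, since each step decides for itself whether to do anything.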
diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopSparkUserExecutorFeatureStep.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopSparkUserExecutorFeatureStep.scala
index e342110763196..c038e75491ca5 100644
--- a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopSparkUserExecutorFeatureStep.scala
+++ b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/HadoopSparkUserExecutorFeatureStep.scala
@@ -28,7 +28,8 @@ private[spark] class HadoopSparkUserExecutorFeatureStep(conf: KubernetesExecutorConf)
   extends KubernetesFeatureConfigStep {
 
   override def configurePod(pod: SparkPod): SparkPod = {
-    val sparkUserName = conf.get(KERBEROS_SPARK_USER_NAME)
-    HadoopBootstrapUtil.bootstrapSparkUserPod(sparkUserName, pod)
+    conf.getOption(KERBEROS_SPARK_USER_NAME).map { user =>
+      HadoopBootstrapUtil.bootstrapSparkUserPod(user, pod)
+    }.getOrElse(pod)
   }
 }
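This step expresses the same conditional with `Option.map(...).getOrElse(pod)` rather than an `if`/`else`; the two styles are interchangeable, which a tiny standalone check illustrates (plain strings stand in for the pod and the bootstrap call here):

```scala
// The two conditional styles used by the feature steps are equivalent.
// "bootstrap" is faked as string concatenation purely for illustration.
def viaIf(userOpt: Option[String], pod: String): String =
  if (userOpt.isDefined) userOpt.get + ":" + pod else pod

def viaMapGetOrElse(userOpt: Option[String], pod: String): String =
  userOpt.map(user => user + ":" + pod).getOrElse(pod)

assert(viaIf(Some("alice"), "pod") == viaMapGetOrElse(Some("alice"), "pod"))
assert(viaIf(None, "pod") == viaMapGetOrElse(None, "pod"))
```

The `map`/`getOrElse` form avoids the unchecked `Option.get` call entirely.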
diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfExecutorFeatureStep.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfExecutorFeatureStep.scala
index 32bb6a5d2bcbb..907271b1cb483 100644
--- a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfExecutorFeatureStep.scala
+++ b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KerberosConfExecutorFeatureStep.scala
@@ -27,18 +27,20 @@ import org.apache.spark.internal.Logging
 private[spark] class KerberosConfExecutorFeatureStep(conf: KubernetesExecutorConf)
   extends KubernetesFeatureConfigStep with Logging {
 
-  private val maybeKrb5CMap = conf.getOption(KRB5_CONFIG_MAP_NAME)
-  require(maybeKrb5CMap.isDefined, "HADOOP_CONF_DIR ConfigMap not found")
-
   override def configurePod(pod: SparkPod): SparkPod = {
-    logInfo(s"Mounting Resources for Kerberos")
-    HadoopBootstrapUtil.bootstrapKerberosPod(
-      conf.get(KERBEROS_DT_SECRET_NAME),
-      conf.get(KERBEROS_DT_SECRET_KEY),
-      conf.get(KERBEROS_SPARK_USER_NAME),
-      None,
-      None,
-      maybeKrb5CMap,
-      pod)
+    val maybeKrb5CMap = conf.getOption(KRB5_CONFIG_MAP_NAME)
+    if (maybeKrb5CMap.isDefined) {
+      logInfo(s"Mounting Resources for Kerberos")
+      HadoopBootstrapUtil.bootstrapKerberosPod(
+        conf.get(KERBEROS_DT_SECRET_NAME),
+        conf.get(KERBEROS_DT_SECRET_KEY),
+        conf.get(KERBEROS_SPARK_USER_NAME),
+        None,
+        None,
+        maybeKrb5CMap,
+        pod)
+    } else {
+      pod
+    }
   }
 }
diff --git a/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/PodTemplateConfigMapStep.scala b/resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/PodTemplateConfigMapStep.scala
index 09dcf93a54f8e..7f41ca43589b6 100644
--- 

[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718208#comment-16718208
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on issue #23220: [SPARK-25877][k8s] Move all feature logic to 
feature classes.
URL: https://github.com/apache/spark/pull/23220#issuecomment-446405494
 
 
   +1 from me, would like @liyinan926 to take a second look


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Put all feature-related code in the feature step itself
> ---
>
> Key: SPARK-25877
> URL: https://issues.apache.org/jira/browse/SPARK-25877
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes
>Affects Versions: 2.4.0
>Reporter: Marcelo Vanzin
>Priority: Major
>
> This is a child task of SPARK-25874. It covers having all the code related to 
> features in the feature steps themselves, including logic about whether a 
> step should be applied or not.
> Please refer to the parent bug for further details.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718201#comment-16718201
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240833837
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s
+
+import java.io.File
+
+import io.fabric8.kubernetes.api.model.{Config => _, _}
+import io.fabric8.kubernetes.client.KubernetesClient
+import io.fabric8.kubernetes.client.dsl.{MixedOperation, PodResource}
+import org.mockito.Matchers.any
+import org.mockito.Mockito.{mock, never, verify, when}
+import scala.collection.JavaConverters._
+
+import org.apache.spark.{SparkConf, SparkException, SparkFunSuite}
+import org.apache.spark.deploy.k8s._
+import org.apache.spark.internal.config.ConfigEntry
+
+abstract class PodBuilderSuite extends SparkFunSuite {
+
+  protected def templateFileConf: ConfigEntry[_]
+
+  protected def buildPod(sparkConf: SparkConf, client: KubernetesClient): SparkPod
+
+  private val baseConf = new SparkConf(false)
+    .set(Config.CONTAINER_IMAGE, "spark-executor:latest")
+
+  test("use empty initial pod if template is not specified") {
+    val client = mock(classOf[KubernetesClient])
+    buildPod(baseConf.clone(), client)
+    verify(client, never()).pods()
+  }
+
+  test("load pod template if specified") {
+    val client = mockKubernetesClient()
+    val sparkConf = baseConf.clone().set(templateFileConf.key, "template-file.yaml")
+    val pod = buildPod(sparkConf, client)
+    verifyPod(pod)
+  }
+
+  test("complain about misconfigured pod template") {
+    val client = mockKubernetesClient(
+      new PodBuilder()
+        .withNewMetadata()
+        .addToLabels("test-label-key", "test-label-value")
+        .endMetadata()
+        .build())
+    val sparkConf = baseConf.clone().set(templateFileConf.key, "template-file.yaml")
+    val exception = intercept[SparkException] {
+      buildPod(sparkConf, client)
+    }
+    assert(exception.getMessage.contains("Could not load pod from template file."))
+  }
+
+  private def mockKubernetesClient(pod: Pod = podWithSupportedFeatures()): KubernetesClient = {
+    val kubernetesClient = mock(classOf[KubernetesClient])
+    val pods =
+      mock(classOf[MixedOperation[Pod, PodList, DoneablePod, PodResource[Pod, DoneablePod]]])
+    val podResource = mock(classOf[PodResource[Pod, DoneablePod]])
+    when(kubernetesClient.pods()).thenReturn(pods)
+    when(pods.load(any(classOf[File]))).thenReturn(podResource)
+    when(podResource.get()).thenReturn(pod)
+    kubernetesClient
+  }
+
+  private def verifyPod(pod: SparkPod): Unit = {
 
 Review comment:
   > That's also an argument for not restoring the mocks, which would go 
against what this change is doing. This test should account for modifications 
made by other steps, since if they modify something unexpected, that can change 
the semantics of the feature (pod template support).
   
   Wouldn't most of those unexpected changes come from the unit tests of the 
individual steps? Granted this test can catch when a change in one step impacts 
behavior in another step, which is important. Given that this isn't changing 
prior code, I'm fine with leaving this as-is and addressing it again later if it 
becomes a problem.



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718071#comment-16718071
 ] 

ASF GitHub Bot commented on SPARK-25877:


vanzin commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240805992
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+  private def verifyPod(pod: SparkPod): Unit = {
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getLabels.containsKey("test-label-key"))
+    assert(metadata.getAnnotations.containsKey("test-annotation-key"))
+    assert(metadata.getNamespace === "namespace")
+    assert(metadata.getOwnerReferences.asScala.exists(_.getName == "owner-reference"))
+    val spec = pod.pod.getSpec
+    assert(!spec.getContainers.asScala.exists(_.getName == "executor-container"))
+    assert(spec.getDnsPolicy === "dns-policy")
+    assert(spec.getHostAliases.asScala.exists(_.getHostnames.asScala.exists(_ == "hostname")))
 
 Review comment:
   I'm not modifying this code, just moving it from its previous location.



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718070#comment-16718070
 ] 

ASF GitHub Bot commented on SPARK-25877:


vanzin commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240805965
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+  private def verifyPod(pod: SparkPod): Unit = {
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getLabels.containsKey("test-label-key"))
+    assert(metadata.getAnnotations.containsKey("test-annotation-key"))
+    assert(metadata.getNamespace === "namespace")
+    assert(metadata.getOwnerReferences.asScala.exists(_.getName == "owner-reference"))
+    val spec = pod.pod.getSpec
+    assert(!spec.getContainers.asScala.exists(_.getName == "executor-container"))
 
 Review comment:
   I'm not modifying this code, just moving it from its previous location.



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16718067#comment-16718067
 ] 

ASF GitHub Bot commented on SPARK-25877:


vanzin commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240805744
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+  private def verifyPod(pod: SparkPod): Unit = {
 
 Review comment:
   I actually did not write this test. I copy & pasted it with zero 
modifications from the previous class, and I'd prefer to keep it that way.
   
   That's also an argument for *not* restoring the mocks, which would go 
against what this change is doing. This test should account for modifications 
made by other steps, since if they modify something unexpected, that can change 
the semantics of the feature (pod template support).



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16716022#comment-16716022
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240454265
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+  private def verifyPod(pod: SparkPod): Unit = {
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getLabels.containsKey("test-label-key"))
+    assert(metadata.getAnnotations.containsKey("test-annotation-key"))
+    assert(metadata.getNamespace === "namespace")
+    assert(metadata.getOwnerReferences.asScala.exists(_.getName == "owner-reference"))
+    val spec = pod.pod.getSpec
+    assert(!spec.getContainers.asScala.exists(_.getName == "executor-container"))
+    assert(spec.getDnsPolicy === "dns-policy")
+    assert(spec.getHostAliases.asScala.exists(_.getHostnames.asScala.exists(_ == "hostname")))
 
 Review comment:
   The second call to `exists` can be `contains` instead, so that we don't pass 
a function object that ignores the argument. Alternatively, both `exists` calls 
can be removed (note `getHostnames` returns a Java list, so it needs its own 
`.asScala` before `flatMap`):
   
   ```
   spec.getHostAliases.asScala.flatMap(_.getHostnames.asScala).contains("hostname")
   ```
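   Both forms find the same matches, which can be checked with plain Scala 
collections standing in for the fabric8 host-alias model:
   
   ```scala
   // Nested `exists` vs. flatten+contains: same result on the same data.
   // Inner lists stand in for each HostAlias's hostname list.
   val hostAliases: List[List[String]] = List(List("other"), List("hostname", "alias"))
   
   val nested  = hostAliases.exists(_.exists(_ == "hostname"))
   val flatter = hostAliases.flatten.contains("hostname")
   
   assert(nested && flatter && nested == flatter)
   ```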




> Put all feature-related 

[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16716025#comment-16716025
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240454265
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
 @@ -0,0 +1,177 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s
+
+import java.io.File
+
+import io.fabric8.kubernetes.api.model.{Config => _, _}
+import io.fabric8.kubernetes.client.KubernetesClient
+import io.fabric8.kubernetes.client.dsl.{MixedOperation, PodResource}
+import org.mockito.Matchers.any
+import org.mockito.Mockito.{mock, never, verify, when}
+import scala.collection.JavaConverters._
+
+import org.apache.spark.{SparkConf, SparkException, SparkFunSuite}
+import org.apache.spark.deploy.k8s._
+import org.apache.spark.internal.config.ConfigEntry
+
+abstract class PodBuilderSuite extends SparkFunSuite {
+
+  protected def templateFileConf: ConfigEntry[_]
+
+  protected def buildPod(sparkConf: SparkConf, client: KubernetesClient): SparkPod
+
+  private val baseConf = new SparkConf(false)
+    .set(Config.CONTAINER_IMAGE, "spark-executor:latest")
+
+  test("use empty initial pod if template is not specified") {
+    val client = mock(classOf[KubernetesClient])
+    buildPod(baseConf.clone(), client)
+    verify(client, never()).pods()
+  }
+
+  test("load pod template if specified") {
+    val client = mockKubernetesClient()
+    val sparkConf = baseConf.clone().set(templateFileConf.key, "template-file.yaml")
+    val pod = buildPod(sparkConf, client)
+    verifyPod(pod)
+  }
+
+  test("complain about misconfigured pod template") {
+    val client = mockKubernetesClient(
+      new PodBuilder()
+        .withNewMetadata()
+        .addToLabels("test-label-key", "test-label-value")
+        .endMetadata()
+        .build())
+    val sparkConf = baseConf.clone().set(templateFileConf.key, "template-file.yaml")
+    val exception = intercept[SparkException] {
+      buildPod(sparkConf, client)
+    }
+    assert(exception.getMessage.contains("Could not load pod from template file."))
+  }
+
+  private def mockKubernetesClient(pod: Pod = podWithSupportedFeatures()): KubernetesClient = {
+    val kubernetesClient = mock(classOf[KubernetesClient])
+    val pods =
+      mock(classOf[MixedOperation[Pod, PodList, DoneablePod, PodResource[Pod, DoneablePod]]])
+    val podResource = mock(classOf[PodResource[Pod, DoneablePod]])
+    when(kubernetesClient.pods()).thenReturn(pods)
+    when(pods.load(any(classOf[File]))).thenReturn(podResource)
+    when(podResource.get()).thenReturn(pod)
+    kubernetesClient
+  }
+
+  private def verifyPod(pod: SparkPod): Unit = {
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getLabels.containsKey("test-label-key"))
+    assert(metadata.getAnnotations.containsKey("test-annotation-key"))
+    assert(metadata.getNamespace === "namespace")
+    assert(metadata.getOwnerReferences.asScala.exists(_.getName == "owner-reference"))
+    val spec = pod.pod.getSpec
+    assert(!spec.getContainers.asScala.exists(_.getName == "executor-container"))
+    assert(spec.getDnsPolicy === "dns-policy")
+    assert(spec.getHostAliases.asScala.exists(_.getHostnames.asScala.exists(_ == "hostname")))
 
 Review comment:
   Nit: The second call to `exists` can be `contains` instead, so that we don't pass a function object that ignores the argument. Alternatively, both `exists` calls can be removed (the inner `.asScala` is still needed to convert the Java list before `flatMap`):
   
   ```
   spec.getHostAliases.asScala.flatMap(_.getHostnames.asScala).contains("hostname")
   ```
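As a standalone illustration of this suggestion (using a hypothetical `Alias` case class in place of the fabric8 `HostAlias` model), the nested-`exists` form and the flattened `contains` form are equivalent:

```scala
import java.util.Arrays
import scala.collection.JavaConverters._

// Hypothetical stand-in for the fabric8 HostAlias model: each alias holds a
// Java list of hostnames, as the real API does.
case class Alias(getHostnames: java.util.List[String])

object ExistsVsContains {
  def main(args: Array[String]): Unit = {
    val aliases = Seq(Alias(Arrays.asList("hostname", "other")))

    // Nested `exists` calls, as in the original assertion:
    val nested = aliases.exists(_.getHostnames.asScala.exists(_ == "hostname"))

    // Flattened form from the review comment; the inner `.asScala` converts
    // the Java list so `flatMap` can consume it:
    val flattened = aliases.flatMap(_.getHostnames.asScala).contains("hostname")

    assert(nested && flattened)
  }
}
```

The flattened form avoids allocating a predicate that ignores its argument and reads as a direct membership check.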





[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16716026#comment-16716026
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240456173
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
+  private def mockKubernetesClient(pod: Pod = podWithSupportedFeatures()): KubernetesClient = {
+    val kubernetesClient = mock(classOf[KubernetesClient])
+    val pods =
+      mock(classOf[MixedOperation[Pod, PodList, DoneablePod, PodResource[Pod, DoneablePod]]])
+    val podResource = mock(classOf[PodResource[Pod, DoneablePod]])
+    when(kubernetesClient.pods()).thenReturn(pods)
+    when(pods.load(any(classOf[File]))).thenReturn(podResource)
+    when(podResource.get()).thenReturn(pod)
+    kubernetesClient
+  }
+
+  private def verifyPod(pod: SparkPod): Unit = {
 
 Review comment:
   Another factor of my concern is that, for each individual assertion, it is unclear which step the assertion is tied to. This reads a lot more like an end-to-end test than a unit test.




> Put all feature-related code in the feature step itself
> ---
>
> Key: SPARK-25877
> URL: https://issues.apache.org/jira/browse/SPARK-25877
> Project: Spark
>  Issue Type: Improvement
>  Components: Kubernetes
>Affects Versions: 2.4.0
>Reporter: Marcelo Vanzin
>Priority: Major
>
> This is a child task of SPARK-25874. It covers having all the code related to 
> features in the feature steps themselves, including logic about whether a 
> step should be applied or not.
> Please refer to the parent bug for further details.




[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16716024#comment-16716024
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240454417
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
+  private def verifyPod(pod: SparkPod): Unit = {
+    val metadata = pod.pod.getMetadata
+    assert(metadata.getLabels.containsKey("test-label-key"))
+    assert(metadata.getAnnotations.containsKey("test-annotation-key"))
+    assert(metadata.getNamespace === "namespace")
+    assert(metadata.getOwnerReferences.asScala.exists(_.getName == "owner-reference"))
+    val spec = pod.pod.getSpec
+    assert(!spec.getContainers.asScala.exists(_.getName == "executor-container"))
 
 Review comment:
   Nit: Use `.asScala.map(_.getName).contains("executor-container")`. Similar changes come up throughout this test.
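To make the nit concrete (with a hypothetical `Container` case class standing in for the fabric8 model), the predicate form and the projection form suggested in the review check the same thing:

```scala
// Hypothetical stand-in for the fabric8 Container model, to contrast a
// predicate that compares one field with a projection plus `contains`.
case class Container(getName: String)

object MapContains {
  def main(args: Array[String]): Unit = {
    val containers = Seq(Container("spark-kubernetes-driver"), Container("sidecar"))

    // Predicate form used in the original assertion:
    val viaExists = containers.exists(_.getName == "executor-container")

    // Projection form suggested in the review:
    val viaContains = containers.map(_.getName).contains("executor-container")

    assert(viaExists == viaContains)  // both are false for this input
  }
}
```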





[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16716023#comment-16716023
 ] 

ASF GitHub Bot commented on SPARK-25877:


mccheah commented on a change in pull request #23220: [SPARK-25877][k8s] Move 
all feature logic to feature classes.
URL: https://github.com/apache/spark/pull/23220#discussion_r240454863
 
 

 ##
 File path: 
resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/PodBuilderSuite.scala
 ##
+  private def verifyPod(pod: SparkPod): Unit = {
 
 Review comment:
   The number of things this test checks is remarkable, and it is very much 
possible to accidentally omit checking the application of a specific feature 
when a new one is added for either the driver or executor. This is why we had 
the overridable feature steps in the original incarnation of these tests. Not 
mocking the substeps leads us to need to check that some specific aspect of 
each step has been applied. Can we go back to mocking the different steps so 
that this test can be more easily modified when we add more features? Or else 
can we abstract away the idea that these steps are applied without this test 
itself needing to know what the step itself actually does?
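The mocking approach described here, verifying that each step was applied rather than re-asserting each step's side effects, can be sketched roughly as follows. The `FeatureStep` trait, `Builder` class, and hand-rolled `RecordingStep` below are simplified placeholders, not Spark's actual `KubernetesFeatureConfigStep` or builder classes, and the pod is reduced to a `String` for brevity:

```scala
// Simplified placeholder for a feature step that transforms a pod.
trait FeatureStep {
  def configurePod(pod: String): String
}

// Simplified placeholder for the pod builder: threads the pod through
// every configured step in order.
class Builder(steps: Seq[FeatureStep]) {
  def build(initial: String): String =
    steps.foldLeft(initial)((pod, step) => step.configurePod(pod))
}

// A recording stub standing in for a Mockito mock: it only records that
// it was invoked, and passes the pod through unchanged.
class RecordingStep extends FeatureStep {
  var applied = false
  override def configurePod(pod: String): String = { applied = true; pod }
}

object MockedStepsSketch {
  def main(args: Array[String]): Unit = {
    val steps = Seq(new RecordingStep, new RecordingStep)
    new Builder(steps).build("base-pod")
    // The test asserts only that every step ran; it stays valid when new
    // feature steps are added, without knowing what each step does.
    assert(steps.forall(_.applied))
  }
}
```

A test written this way survives the addition of new feature steps, since it abstracts over what each step does and only checks that the builder applies all of them.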





[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-10 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16715822#comment-16715822
 ] 

ASF GitHub Bot commented on SPARK-25877:


vanzin commented on issue #23220: [SPARK-25877][k8s] Move all feature logic to 
feature classes.
URL: https://github.com/apache/spark/pull/23220#issuecomment-446022089
 
 
   So, anybody interested in reviewing this?







--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-04 Thread Apache Spark (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709429#comment-16709429
 ] 

Apache Spark commented on SPARK-25877:
--

User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/23220







[jira] [Commented] (SPARK-25877) Put all feature-related code in the feature step itself

2018-12-04 Thread Apache Spark (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-25877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16709428#comment-16709428
 ] 

Apache Spark commented on SPARK-25877:
--

User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/23220



