Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21067#discussion_r194837902
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/KubernetesFeatureConfigStep.scala ---
    @@ -18,54 +18,87 @@ package org.apache.spark.deploy.k8s.features
     
     import io.fabric8.kubernetes.api.model.HasMetadata
     
    -import org.apache.spark.deploy.k8s.SparkPod
    +import org.apache.spark.deploy.k8s.{SparkExecutorPod, SparkJob}
     
     /**
     - * A collection of functions that together represent a "feature" in pods that are launched for
    + * A collection of functions that together represent a
    + * "feature" in jobs and pods that are launched for
      * Spark drivers and executors.
      */
     private[spark] trait KubernetesFeatureConfigStep {
     
       /**
     -   * Apply modifications on the given pod in accordance to this feature. This can include attaching
     +   * Apply modifications on the given job in accordance with this feature. This can include attaching
        * volumes, adding environment variables, and adding labels/annotations.
        * <p>
     -   * Note that we should return a SparkPod that keeps all of the properties of the passed SparkPod
     +   * Note that we should return a SparkJob that keeps all of the properties of the passed SparkJob
        * object. So this is correct:
        * <pre>
    -   * {@code val configuredPod = new PodBuilder(pod.pod)
    +   * {@code val configuredJob = new JobBuilder(job.job)
        *     .editSpec()
        *     ...
        *     .build()
    -   *   val configuredContainer = new ContainerBuilder(pod.container)
    +   *   val configuredContainer = new ContainerBuilder(job.container)
        *     ...
        *     .build()
    -   *   SparkPod(configuredPod, configuredContainer)
    +   *   SparkJob(configuredJob, configuredContainer)
        *  }
        * </pre>
        * This is incorrect:
        * <pre>
     -   * {@code val configuredPod = new PodBuilder() // Loses the original state
     +   * {@code val configuredJob = new JobBuilder() // Loses the original state
        *     .editSpec()
        *     ...
        *     .build()
        *   val configuredContainer = new ContainerBuilder() // Loses the original state
        *     ...
        *     .build()
    -   *   SparkPod(configuredPod, configuredContainer)
    +   *   SparkJob(configuredJob, configuredContainer)
        *  }
        * </pre>
        */
    -  def configurePod(pod: SparkPod): SparkPod
    -
    +  def configureJob(job: SparkJob): SparkJob = SparkJob.initialJob()
     +  /**
     +  * Apply modifications on the given executor pod in
     +  * accordance with this feature. This can include attaching
    +  * volumes, adding environment variables, and adding labels/annotations.
    +  * <p>
     +  * Note that we should return a SparkExecutorPod that keeps all
     +  * of the properties of the passed SparkExecutorPod
     +  * object. So this is correct:
    +  * <pre>
    +  * {@code val configuredExecutorPod = new PodBuilder(pod.pod)
    +  *     .editSpec()
    +  *     ...
    +  *     .build()
    +  *   val configuredContainer = new ContainerBuilder(pod.container)
    +  *     ...
    +  *     .build()
     +  *   SparkExecutorPod(configuredExecutorPod, configuredContainer)
    +  *  }
    +  * </pre>
    +  * This is incorrect:
    +  * <pre>
     +  * {@code val configuredExecutorPod = new PodBuilder() // Loses the original state
    +  *     .editSpec()
    +  *     ...
    +  *     .build()
     +  *   val configuredContainer = new ContainerBuilder() // Loses the original state
    +  *     ...
    +  *     .build()
     +  *   SparkExecutorPod(configuredExecutorPod, configuredContainer)
    +  *  }
    +  * </pre>
    +  */
     +  def configureExecutorPod(pod: SparkExecutorPod): SparkExecutorPod = SparkExecutorPod.initialPod()
    --- End diff ---
    
    The job has a pod template though, right? We should just inject the pod into the job as the pod template.
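
    For illustration, one way to realize that suggestion is to copy the configured pod's metadata and spec into the Job's pod template via fabric8's generated builders. This is a hedged sketch, not code from this PR: the `injectPodTemplate` helper name is hypothetical, and it assumes the existing `SparkPod(pod, container)` shape from the current trait.

    ```scala
    import io.fabric8.kubernetes.api.model.batch.{Job, JobBuilder}
    import org.apache.spark.deploy.k8s.SparkPod

    // Hypothetical helper: embed an already-configured pod into the Job's
    // pod template, so the existing configurePod(...) features keep working
    // unchanged when the driver is launched as a Job.
    private[spark] def injectPodTemplate(job: Job, sparkPod: SparkPod): Job =
      new JobBuilder(job)
        .editOrNewSpec()
          .editOrNewTemplate()
            .withMetadata(sparkPod.pod.getMetadata) // carry over labels/annotations
            .withSpec(sparkPod.pod.getSpec)         // carry over volumes, containers, env
          .endTemplate()
        .endSpec()
        .build()
    ```

    With something like this, features could keep a single `configurePod` entry point and the Job wrapper would be applied once at the end, rather than every feature implementing a separate `configureJob`.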

