GitHub user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20669#discussion_r172677514
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/submit/steps/DriverConfigPropertiesStep.scala ---
    @@ -0,0 +1,85 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.spark.deploy.k8s.submit.steps
    +
    +import java.io.StringWriter
    +import java.util.Properties
    +
    +import io.fabric8.kubernetes.api.model._
    +
    +import org.apache.spark.SparkConf
    +import org.apache.spark.deploy.k8s.Config._
    +import org.apache.spark.deploy.k8s.Constants._
    +import org.apache.spark.deploy.k8s.submit.KubernetesDriverSpec
    +
    +/**
    + * Create a config map with the driver configuration and attach it to the pod. This needs to
    + * come at the end of the driver configuration so that all modifications to the Spark config
    + * are reflected in the generated config map.
    + */
    +private[spark] class DriverConfigPropertiesStep(resourceNamePrefix: String)
    +    extends DriverConfigurationStep {
    +
    +  override def configureDriver(spec: KubernetesDriverSpec): KubernetesDriverSpec = {
    +    val configMapName = s"$resourceNamePrefix-driver-conf-map"
    +    val configMap = buildConfigMap(configMapName, spec.driverSparkConf)
    --- End diff --
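
    The diff is truncated above, but from the imports (`StringWriter`, `Properties`, the fabric8 model) `buildConfigMap` plausibly serializes the conf as Java properties and wraps the result in a `ConfigMap`. A rough sketch of that shape, where the `spark.properties` data key is an assumption rather than the PR's actual constant:

    ```scala
    import java.io.StringWriter
    import java.util.Properties

    import io.fabric8.kubernetes.api.model.{ConfigMap, ConfigMapBuilder}

    import org.apache.spark.SparkConf

    object ConfigMapSketch {
      // Hedged sketch of what buildConfigMap plausibly does; "spark.properties"
      // is an assumed data key, not necessarily the PR's constant.
      def buildConfigMap(name: String, conf: SparkConf): ConfigMap = {
        val properties = new Properties()
        conf.getAll.foreach { case (k, v) => properties.setProperty(k, v) }
        val writer = new StringWriter()
        properties.store(writer, "Spark driver configuration")
        new ConfigMapBuilder()
          .withNewMetadata()
            .withName(name)
          .endMetadata()
          .addToData("spark.properties", writer.toString)
          .build()
      }
    }
    ```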
    
    It's risky to pass the driver Spark configuration this way. As built, this step must be placed last in the list of steps returned by the orchestrator. Ideally, the orchestrator should be able to return steps (monads) that can be applied in any order, but as written, the driver Spark configuration must be completely built in the spec before this step runs.
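
    To make the ordering hazard concrete, here is a minimal sketch (stand-in type names, not the real `KubernetesDriverSpec` or `DriverConfigurationStep` from this PR) of how a conf value set by a later step silently escapes the snapshot:

    ```scala
    import org.apache.spark.SparkConf

    object OrderingHazardDemo extends App {
      // Stand-ins for the PR's spec/step types, for illustration only.
      case class DriverSpec(driverSparkConf: SparkConf)
      trait Step { def configureDriver(spec: DriverSpec): DriverSpec }

      // A snapshotting step (like DriverConfigPropertiesStep) captures only
      // what earlier steps wrote into the conf...
      val snapshotStep = new Step {
        def configureDriver(spec: DriverSpec): DriverSpec = {
          println(s"snapshotted: ${spec.driverSparkConf.getAll.toList}")
          spec
        }
      }

      // ...so a conf value set by a step ordered after it never reaches
      // the generated config map.
      val lateStep = new Step {
        def configureDriver(spec: DriverSpec): DriverSpec =
          DriverSpec(spec.driverSparkConf.clone().set("spark.late.option", "lost"))
      }

      val initial = DriverSpec(new SparkConf(false).set("spark.app.name", "demo"))
      // Prints only spark.app.name; spark.late.option is set too late.
      Seq(snapshotStep, lateStep).foldLeft(initial)((s, step) => step.configureDriver(s))
    }
    ```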
    
    I think this code shouldn't necessarily live in a config step at all, but should instead run directly in the outer submission client. That's what we were doing before, when we passed all of the driver Spark configurations to the main class as system properties. We could do the same thing here, but pass these values as `--conf` options instead. Thoughts?
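
    Roughly what I have in mind, as a hedged sketch (hypothetical helper, not code from this PR): once every step has run, the submission client expands the final conf into `--conf` options itself, so no particular step ordering is required:

    ```scala
    import org.apache.spark.SparkConf

    object ConfArgs {
      // Hypothetical helper: expand a fully built SparkConf into --conf
      // options for the driver's main-class invocation. Sorting keeps the
      // generated command line deterministic.
      def toConfArgs(conf: SparkConf): Seq[String] =
        conf.getAll.toSeq.sortBy(_._1).flatMap { case (k, v) =>
          Seq("--conf", s"$k=$v")
        }
    }

    // e.g. Seq("--conf", "spark.app.name=demo", "--conf", "spark.driver.memory=2g")
    ```

    Because the client assembles the command line after all steps have executed, it always sees the completely built conf.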

