Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20761#discussion_r220022949
  
    --- Diff: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ResourceTypeValidator.scala ---
    @@ -0,0 +1,106 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.deploy.yarn
    +
    +import scala.collection.mutable
    +
    +import org.apache.spark.{SparkConf, SparkException}
    +import org.apache.spark.deploy.yarn.config._
    +
    +private object ResourceTypeValidator {
    +  private val ERROR_PREFIX: String = "Error: "
    +  private val POSSIBLE_RESOURCE_DEFINITIONS = Seq[(String, String)](
    +    ("spark.yarn.am.memory", YARN_AM_RESOURCE_TYPES_PREFIX + "memory"),
    +    ("spark.yarn.am.cores", YARN_AM_RESOURCE_TYPES_PREFIX + "cores"),
    +    ("spark.driver.memory", YARN_DRIVER_RESOURCE_TYPES_PREFIX + "memory"),
    +    ("spark.driver.cores", YARN_DRIVER_RESOURCE_TYPES_PREFIX + "cores"),
    +    ("spark.executor.memory", YARN_EXECUTOR_RESOURCE_TYPES_PREFIX + 
"memory"),
    +    ("spark.executor.cores", YARN_EXECUTOR_RESOURCE_TYPES_PREFIX + 
"cores"))
    +
    +  /**
    +   * Validates sparkConf and throws a SparkException if a standard resource (memory or cores)
    +   * is defined with the property spark.yarn.x.resource.y<br>
    +   *
    +   * Example of an invalid config:<br>
    +   * - spark.yarn.driver.resource.memory=2g<br>
    +   *
    +   * Please note that if multiple resources are defined as described above,
    +   * the error messages will be concatenated.<br>
    +   * Example of such a config:<br>
    +   * - spark.yarn.driver.resource.memory=2g<br>
    +   * - spark.yarn.executor.resource.cores=2<br>
    +   * Then the following two error messages will be printed:<br>
    +   * - "memory cannot be requested with config 
spark.yarn.driver.resource.memory,
    +   * please use config spark.driver.memory instead!<br>
    +   * - "cores cannot be requested with config 
spark.yarn.executor.resource.cores,
    +   * please use config spark.executor.cores instead!<br>
    +   *
    +   * @param sparkConf
    +   */
    +  def validateResources(sparkConf: SparkConf): Unit = {
    +    val overallErrorMessage = new mutable.StringBuilder()
    +    POSSIBLE_RESOURCE_DEFINITIONS.foreach { resdef =>
    +      val customResources: Map[String, String] =
    --- End diff --
    
    I'm going to ask the same question I've asked before: why is this so complicated?
    
    Why do you have to parse the config into a map for every key you're checking, instead of just checking whether the illegal key exists in the configuration?
    
    e.g. with the names I suggested above:
    
    ```
    if (sparkConf.contains(resourceRequest)) {
      errorMessage.append(s"Don't use $resourceRequest; use $sparkName instead.")
    }
    ```
    
    That's 3 lines of code, has all the info you need, and doesn't parse the config over and over again on each iteration. It's also what I've been suggesting since my first review of this class.
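    
    For concreteness, here's a rough sketch of what the whole validator could look like with that approach. The names (POSSIBLE_RESOURCE_DEFINITIONS, sparkName, resourceRequest) just mirror the pairs quoted in the diff above and the naming I suggested earlier; treat it as an illustration of the shape, not the final code:
    
    ```
    private object ResourceTypeValidator {
      // (suggested Spark config, disallowed resource-request config) pairs,
      // mirroring the POSSIBLE_RESOURCE_DEFINITIONS sequence in the diff above.
      private val POSSIBLE_RESOURCE_DEFINITIONS = Seq(
        ("spark.yarn.am.memory", YARN_AM_RESOURCE_TYPES_PREFIX + "memory"),
        ("spark.yarn.am.cores", YARN_AM_RESOURCE_TYPES_PREFIX + "cores"),
        ("spark.driver.memory", YARN_DRIVER_RESOURCE_TYPES_PREFIX + "memory"),
        ("spark.driver.cores", YARN_DRIVER_RESOURCE_TYPES_PREFIX + "cores"),
        ("spark.executor.memory", YARN_EXECUTOR_RESOURCE_TYPES_PREFIX + "memory"),
        ("spark.executor.cores", YARN_EXECUTOR_RESOURCE_TYPES_PREFIX + "cores"))
    
      def validateResources(sparkConf: SparkConf): Unit = {
        // One message per disallowed key that is actually set; no parsing of the
        // config into a map is needed, just a contains() check per key.
        val errors = POSSIBLE_RESOURCE_DEFINITIONS.collect {
          case (sparkName, resourceRequest) if sparkConf.contains(resourceRequest) =>
            s"Don't use $resourceRequest; use $sparkName instead."
        }
        if (errors.nonEmpty) {
          throw new SparkException(errors.mkString(" "))
        }
      }
    }
    ```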


---
