Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20785#discussion_r174968586
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2434,7 +2434,8 @@ private[spark] object Utils extends Logging {
        */
       def getSparkOrYarnConfig(conf: SparkConf, key: String, default: String): String = {
         val sparkValue = conf.get(key, default)
    -    if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn") {
    +    if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn"
    --- End diff ---
    
    No.
    
    The logic you want here is the equivalent of:
    
    ```
    if conf.contains(key)
      get spark conf
    elif is_running_on_yarn()
      get conf from yarn
    else
      return default
    ```
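
    For reference, a minimal Scala sketch of that flow (not the final patch; it assumes the YARN branch keeps resolving the key through `SparkHadoopUtil`/`YarnConfiguration` the way the method does today):

    ```
    import org.apache.hadoop.yarn.conf.YarnConfiguration
    import org.apache.spark.SparkConf
    import org.apache.spark.deploy.SparkHadoopUtil
    import org.apache.spark.launcher.SparkLauncher

    def getSparkOrYarnConfig(conf: SparkConf, key: String, default: String): String = {
      if (conf.contains(key)) {
        // An explicit Spark setting always wins.
        conf.get(key)
      } else if (conf.get(SparkLauncher.SPARK_MASTER, null) == "yarn") {
        // Running on YARN: fall back to the Hadoop/YARN configuration.
        new YarnConfiguration(SparkHadoopUtil.get.newConfiguration(conf)).get(key, default)
      } else {
        // Not on YARN and not set in the Spark conf: use the caller's default.
        default
      }
    }
    ```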


---
