Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22174#discussion_r211727767
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -1954,14 +1954,7 @@ class SQLConf extends Serializable with Logging {
             entry.valueConverter(defaultValue)
           }
         }
    -    Option(settings.get(key)).getOrElse {
    -      // If the key is not set, need to check whether the config entry is registered and is
    -      // a fallback conf, so that we can check its parent.
    -      sqlConfEntries.get(key) match {
    -        case e: FallbackConfigEntry[_] => getConfString(e.fallback.key, defaultValue)
    -        case _ => defaultValue
    -      }
    -    }
    +    Option(settings.get(key)).getOrElse(defaultValue)
    --- End diff --
    
    This is a behavior change. We already have a conf that relies on this fallback mechanism. See:
    ```
      val SQL_STRING_REDACTION_PATTERN =
        buildConf("spark.sql.redaction.string.regex")
          .doc("Regex to decide which parts of strings produced by Spark contain sensitive " +
            "information. When this regex matches a string part, that string part is replaced by a " +
            "dummy value. This is currently used to redact the output of SQL explain commands. " +
            "When this conf is not set, the value from `spark.redaction.string.regex` is used.")
          .fallbackConf(org.apache.spark.internal.config.STRING_REDACTION_PATTERN)
    ```
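
    To illustrate why dropping the `FallbackConfigEntry` branch changes behavior, here is a minimal, self-contained sketch of the fallback lookup. The names (`Plain`, `Fallback`, the demo keys) are illustrative stand-ins, not Spark's actual types; only the lookup shape mirrors the deleted code:

    ```scala
    // Hypothetical minimal model of SQLConf's fallback lookup.
    // Names and types here are illustrative, not Spark's real API.
    object FallbackDemo {
      sealed trait Entry { def key: String }
      case class Plain(key: String) extends Entry
      case class Fallback(key: String, parent: Entry) extends Entry

      // settings = explicitly set values; registered = known config entries.
      val settings = scala.collection.mutable.Map[String, String]()
      val registered = scala.collection.mutable.Map[String, Entry]()

      def getConfString(key: String, defaultValue: String): String =
        settings.get(key).getOrElse {
          // If the key is not set, a registered fallback entry defers to its
          // parent key; this is the branch the diff removes.
          registered.get(key) match {
            case Some(Fallback(_, parent)) => getConfString(parent.key, defaultValue)
            case _ => defaultValue
          }
        }

      def main(args: Array[String]): Unit = {
        registered("spark.sql.redaction.string.regex") =
          Fallback("spark.sql.redaction.string.regex", Plain("spark.redaction.string.regex"))
        // Only the parent key is set; the SQL key resolves through the fallback.
        settings("spark.redaction.string.regex") = "(?i)secret"
        println(getConfString("spark.sql.redaction.string.regex", "none"))
      }
    }
    ```

    With the simplified `Option(settings.get(key)).getOrElse(defaultValue)`, the lookup above would return `"none"` instead of the parent's value, which is the regression being flagged.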


---
