Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5514#discussion_r28437640
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -183,36 +187,36 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         Utils.timeStringAsSeconds(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as seconds, falling back to a default if not set. If no 
    +  /**
    +   * Get a time parameter as seconds, falling back to a default if not set. If no
        * suffix is provided then seconds are assumed.
    -   * 
    +   *
        */
       def getTimeAsSeconds(key: String, defaultValue: String): Long = {
         Utils.timeStringAsSeconds(get(key, defaultValue))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no
    +   * suffix is provided then milliseconds are assumed.
        * @throws NoSuchElementException
        */
       def getTimeAsMs(key: String): Long = {
         Utils.timeStringAsMs(get(key))
       }
     
    -  /** 
    -   * Get a time parameter as milliseconds, falling back to a default if not set. If no 
    -   * suffix is provided then milliseconds are assumed. 
    +  /**
    +   * Get a time parameter as milliseconds, falling back to a default if not set. If no
    +   * suffix is provided then milliseconds are assumed.
        */
       def getTimeAsMs(key: String, defaultValue: String): Long = {
         Utils.timeStringAsMs(get(key, defaultValue))
       }
    -  
    +
     
       /** Get a parameter as an Option */
       def getOption(key: String): Option[String] = {
    -    Option(settings.get(key))
    +    Option(settings.get(key)).orElse(getDeprecatedConfig(key, this))
    --- End diff --
    
    I think that would be more complicated. Because then you'd also have to 
figure out that when the user sets, for example, 
`spark.history.fs.updateInterval`, you should also set 
`spark.history.updateInterval`, and that information is not readily available 
in the current data structures. Also, it would use a tiny bit more memory for 
no particular gain.
    
    It's doable if I reorganize things, but I don't think it's worth it.
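    The read-time fallback in the diff (`Option(settings.get(key)).orElse(getDeprecatedConfig(key, this))`) can be sketched in miniature. This is an illustrative stand-in, not Spark's actual implementation: the `DeprecatedConfDemo` object, its `deprecated` map, and the standalone `getOption` signature are assumptions made for the example; the key names come from the comment above.

    ```scala
    object DeprecatedConfDemo {
      // Hypothetical mapping from a current key to the deprecated key it supersedes.
      private val deprecated: Map[String, String] =
        Map("spark.history.fs.updateInterval" -> "spark.history.updateInterval")

      // Read-time fallback: try the current key first, then the deprecated one.
      // Translating at write time instead would require the reverse mapping,
      // which the comment notes is not readily available in the current
      // data structures.
      def getOption(settings: Map[String, String], key: String): Option[String] =
        settings.get(key).orElse(deprecated.get(key).flatMap(settings.get))
    }
    ```

    With only the old key present, reading the new key still succeeds: `getOption(Map("spark.history.updateInterval" -> "10s"), "spark.history.fs.updateInterval")` yields `Some("10s")`, while a value set under the new key takes precedence.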

