GitHub user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4214#discussion_r25106244
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -43,9 +47,33 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
     
       private val NOT_STARTED = "<Not Started>"
     
    +  // One day
    +  private val DEFAULT_SPARK_HISTORY_FS_CLEANER_INTERVAL_S = Duration(1, TimeUnit.DAYS).toSeconds
    +
    +  // One week
    +  private val DEFAULT_SPARK_HISTORY_FS_MAXAGE_S = Duration(7, TimeUnit.DAYS).toSeconds
    +
    +  private def warnUpdateInterval(key: String, value: String): String = {
    +    logWarning(s"Using $key to set interval " +
    +      "between each check for event log updates is deprecated, " +
    +      "please use spark.history.fs.update.interval.seconds instead.")
    +    value
    +  }
    +
    +  private def getDeprecatedConfig(conf: SparkConf, key: String): Option[String] = {
    +    conf.getOption(key).map(warnUpdateInterval(key, _))
    +  }
    --- End diff --
    
    Can you do all of these through `SparkConf.deprecatedConfigs` instead of doing it here? You may need to rebase onto master to pick up those changes.
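    For context, the pattern being suggested is a single lookup table that maps each deprecated key to its replacement and warns on use, rather than one-off helpers per config. A minimal standalone sketch of that pattern is below; `ConfigCompat`, `DeprecatedConfig`, and `translateKey` are illustrative names for this sketch, not Spark's actual API.

```scala
// Hedged sketch: centralizing deprecated-config handling in one map,
// in the spirit of SparkConf.deprecatedConfigs. Names are hypothetical.
object ConfigCompat {
  // Each deprecated key carries its replacement and a warning message.
  case class DeprecatedConfig(oldKey: String, newKey: String, message: String)

  private val deprecated: Map[String, DeprecatedConfig] = Seq(
    DeprecatedConfig(
      "spark.history.updateInterval",
      "spark.history.fs.update.interval.seconds",
      "Please use spark.history.fs.update.interval.seconds instead.")
  ).map(d => d.oldKey -> d).toMap

  // Resolve a lookup key: if it is deprecated, warn and return the new key;
  // otherwise return the key unchanged.
  def translateKey(key: String): String = deprecated.get(key) match {
    case Some(d) =>
      System.err.println(s"Warning: $key is deprecated. ${d.message}")
      d.newKey
    case None => key
  }
}
```

    With this shape, call sites never need per-key warning helpers like `warnUpdateInterval`; they just resolve keys through the shared table.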


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
