Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/2471#discussion_r18107160
--- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -34,10 +34,20 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
private val NOT_STARTED = "<Not Started>"
+ //one day
+ private val DEFAULT_SPARK_HISTORY_FS_CLEANER_INTERVAL_S = 1 * 24 * 60 * 60
+
+ //one week
+ private val DEFAULT_SPARK_HISTORY_FS_MAXAGE_S = 7 * 24 * 60 * 60
+
// Interval between each check for event log updates
private val UPDATE_INTERVAL_MS = conf.getInt("spark.history.fs.updateInterval",
--- End diff ---
Maybe we should make the naming of this config consistent by calling this
`spark.history.fs.update.interval.ms` (and deprecate the old one)? We don't
have to do that in this PR.
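A minimal sketch of the fallback pattern such a rename-with-deprecation implies: read the new key first, fall back to the old one, then to a default. A plain `Map` stands in for `SparkConf` here, and `getIntWithFallback` is a hypothetical helper, not an actual Spark API.

```scala
object ConfigFallbackSketch {
  // Sketch only: prefer newKey; fall back to the deprecated oldKey;
  // otherwise use the default. Unit conversion between the old and new
  // keys (seconds vs. milliseconds) is deliberately out of scope here.
  def getIntWithFallback(conf: Map[String, String],
                         newKey: String,
                         oldKey: String,
                         default: Int): Int =
    conf.get(newKey).orElse(conf.get(oldKey)).map(_.toInt).getOrElse(default)

  def main(args: Array[String]): Unit = {
    // Only the deprecated key is set, so its value wins over the default.
    val conf = Map("spark.history.fs.updateInterval" -> "10")
    val interval = getIntWithFallback(conf,
      "spark.history.fs.update.interval.ms",
      "spark.history.fs.updateInterval",
      99)
    println(interval)
  }
}
```

In a real deprecation, the fallback read would typically also log a warning pointing users at the new key.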