Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/5491#discussion_r28297179
--- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -273,35 +273,28 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
*/
private def cleanLogs(): Unit = {
try {
- val statusList = Option(fs.listStatus(new Path(logDir))).map(_.toSeq)
-   .getOrElse(Seq[FileStatus]())
val maxAge = conf.getLong("spark.history.fs.cleaner.maxAge.seconds",
DEFAULT_SPARK_HISTORY_FS_MAXAGE_S) * 1000
val now = System.currentTimeMillis()
val appsToRetain = new mutable.LinkedHashMap[String, FsApplicationHistoryInfo]()
+ // Scan all logs from the log directory.
+ // Only completed applications older than the specified max age will be deleted.
applications.values.foreach { info =>
- if (now - info.lastUpdated <= maxAge) {
+ if (now - info.lastUpdated <= maxAge || !info.completed) {
appsToRetain += (info.id -> info)
+ } else {
--- End diff --
This `else` block needs to stay after line 297 (`applications =
appsToRetain`). That ordering ensures the code makes a best effort at
deleting app data only after the apps have been removed from the app list.
A user may still be looking at an old version of the list, and thus may
still be able to click on a stale link, but at least the HS's internal state
stays consistent.
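A minimal, self-contained sketch of the ordering being suggested (hypothetical names; `AppInfo`, `CleanerSketch`, and the `deleted` buffer standing in for the real `FsApplicationHistoryInfo` and filesystem deletes are mine, not Spark's): swap the pruned map into `applications` first, then delete the log data, so the provider never lists an app whose data is already gone.

```scala
import scala.collection.mutable

// Hypothetical stand-ins for FsApplicationHistoryInfo and fs.delete().
case class AppInfo(id: String, lastUpdated: Long, completed: Boolean)

class CleanerSketch(maxAge: Long) {
  var applications = mutable.LinkedHashMap[String, AppInfo]()
  val deleted = mutable.Buffer[String]() // records "deleted" log data

  def cleanLogs(now: Long): Unit = {
    val appsToRetain = mutable.LinkedHashMap[String, AppInfo]()
    val appsToClean = mutable.Buffer[AppInfo]()
    applications.values.foreach { info =>
      // Keep apps that are recent enough, and never clean incomplete apps.
      if (now - info.lastUpdated <= maxAge || !info.completed) {
        appsToRetain += (info.id -> info)
      } else {
        appsToClean += info
      }
    }
    // Publish the pruned list *before* touching any log data, so the
    // internal state never references data that is about to disappear.
    applications = appsToRetain
    appsToClean.foreach(info => deleted += info.id)
  }
}
```

With `maxAge = 50` and `now = 100`, a completed app last updated at 10 is cleaned, while a recent app and an incomplete old app are both retained.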