Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/2471#discussion_r18934524
--- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -210,7 +220,44 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
applications = newApps
}
} catch {
- case t: Throwable => logError("Exception in checking for event log updates", t)
+ case t: Exception => logError("Exception in checking for event log updates", t)
+ }
+ }
+
+ /**
+ * Delete event logs from the log directory according to the clean policy defined by the user.
+ */
+ private def cleanLogs() = {
+ try {
+ val logStatus = fs.listStatus(new Path(resolvedLogDir))
+ val logDirs = if (logStatus != null) logStatus.filter(_.isDir).toSeq else Seq[FileStatus]()
+ val maxAge = conf.getLong("spark.history.fs.maxAge.seconds", DEFAULT_SPARK_HISTORY_FS_MAXAGE_S) * 1000
+
+ val now = System.currentTimeMillis()
+
+ // Scan all logs from the log directory.
+ // Only directories older than maxAge milliseconds will be deleted
+ logDirs.foreach { dir =>
+ if (now - getModificationTime(dir) > maxAge) {
+ fs.delete(dir.getPath, true)
--- End diff --
You forgot to handle exceptions here (see my previous comments on the subject).
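To spell out the pattern the comment is asking for: a minimal, self-contained sketch (hypothetical names, not the actual FsHistoryProvider code) of wrapping each deletion in its own try/catch, so that one failing delete is logged and the cleanup pass continues with the remaining directories. `deleteDir` stands in for `fs.delete(dir.getPath, true)`, and `modTime` for `getModificationTime`.

```scala
import java.nio.file.{Path, Paths}
import scala.util.control.NonFatal

object CleanupSketch {
  // Delete every directory older than maxAge; a failure on one directory
  // is caught and logged instead of aborting the whole loop.
  // Returns the directories that were actually deleted.
  def cleanExpired(dirs: Seq[Path], now: Long, maxAge: Long,
                   modTime: Path => Long,
                   deleteDir: Path => Unit): Seq[Path] = {
    val deleted = Seq.newBuilder[Path]
    dirs.foreach { dir =>
      if (now - modTime(dir) > maxAge) {
        try {
          deleteDir(dir)
          deleted += dir
        } catch {
          // Catch per-directory so one bad entry doesn't stop cleanup.
          case NonFatal(e) =>
            System.err.println(s"Failed to delete $dir: ${e.getMessage}")
        }
      }
    }
    deleted.result()
  }
}
```

The per-iteration catch mirrors the `case t: Exception` handling already added to checkForLogs above: nonfatal errors are logged, and the provider keeps running.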