Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/1222#discussion_r15477516
--- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -110,14 +121,22 @@ private[history] class FsHistoryProvider(conf: SparkConf) extends ApplicationHis
    * Tries to reuse as much of the data already in memory as possible, by not reading
    * applications that haven't been updated since last time the logs were checked.
    */
-  private def checkForLogs() = {
+  private[history] def checkForLogs() = {
     lastLogCheckTimeMs = getMonotonicTimeMs()
     logDebug("Checking for logs. Time is now %d.".format(lastLogCheckTimeMs))
     try {
-      val logStatus = fs.listStatus(new Path(logDir))
-      val logDirs = if (logStatus != null) logStatus.filter(_.isDir).toSeq else Seq[FileStatus]()
-      val logInfos = logDirs.filter {
-        dir => fs.isFile(new Path(dir.getPath(), EventLoggingListener.APPLICATION_COMPLETE))
+      val matcher = EventLoggingListener.LOG_FILE_NAME_REGEX
+      val logInfos = fs.listStatus(new Path(logDir)).filter { entry =>
+        if (entry.isDir()) {
+          fs.exists(new Path(entry.getPath(), APPLICATION_COMPLETE))
+        } else {
+          try {
+            val matcher(_, _, version, codecName, inprogress) = entry.getPath().getName()
+            inprogress == null
+          } catch {
+            case e: Exception => false
--- End diff --
We probably want to at least `logError` the exception so we don't quietly
swallow it
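To illustrate the suggestion, here is a minimal, self-contained sketch — not the actual Spark code. `LogFileNameRegex` and `logError` below are simplified stand-ins for `EventLoggingListener.LOG_FILE_NAME_REGEX` and the `Logging` trait's `logError(msg, throwable)`. The point: a regex extractor in a `val` pattern throws `scala.MatchError` when the name does not match, and the `catch` can log that before returning `false` rather than swallowing it silently.

```scala
object LogNameCheck {
  // Simplified name pattern (hypothetical): "app-<id>" plus an optional
  // ".inprogress" suffix; the real LOG_FILE_NAME_REGEX is more involved.
  private val LogFileNameRegex = """app-(\d+)(\.inprogress)?""".r

  // Stand-in for the Logging trait's logError(msg, throwable).
  private def logError(msg: String, e: Throwable): Unit =
    Console.err.println(s"ERROR: $msg (${e.getClass.getSimpleName})")

  def isCompleteLogFile(name: String): Boolean =
    try {
      // A non-matching name makes this extractor throw scala.MatchError.
      val LogFileNameRegex(_, inprogress) = name
      inprogress == null // complete only when ".inprogress" is absent
    } catch {
      case e: Exception =>
        // The review suggestion: leave a trace instead of a silent `false`.
        logError(s"Malformed event log name: $name", e)
        false
    }
}
```

With this, `isCompleteLogFile("app-42")` is true, while an in-progress or malformed name yields false — and the malformed case now shows up in the error log.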