GitHub user liancheng opened a pull request:
https://github.com/apache/spark/pull/2945
[SPARK-4091][Core] Don't delete spark local directories twice
Temporary local directories created by `DiskBlockManager` are deleted in two
places:
1. in `DiskBlockManager.addShutdownHook()`
2. in `DiskBlockManager.stop()`
Both are executed as shutdown hooks in parallel and may cause subtle race
conditions that make the second deletion fail. In most cases this failure is
harmless, but when it happens on Jenkins builds that run the Spark SQL tests,
it fails `CliSuite`.
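The general pattern behind the fix can be sketched as follows. This is a hypothetical illustration, not the actual Spark patch: the class name `LocalDirCleaner`, the method names, and the counter are all invented here to show how an `AtomicBoolean` guard makes a cleanup routine idempotent, so that a shutdown hook and an explicit `stop()` can both invoke it without the second call racing on an already-deleted directory.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch (not the actual Spark patch): an idempotent cleanup
// routine that a shutdown hook and an explicit stop() can both call safely.
public class LocalDirCleaner {
    private final AtomicBoolean deleted = new AtomicBoolean(false);
    private int deletions = 0; // for illustration: counts actual deletions

    // Only the first caller wins the compareAndSet and performs the
    // deletion; every later call is a no-op instead of a failing retry.
    public void deleteLocalDirs() {
        if (deleted.compareAndSet(false, true)) {
            // ... recursively delete the temporary local directories here ...
            deletions++;
        }
    }

    public int deletionCount() {
        return deletions;
    }

    public static void main(String[] args) {
        LocalDirCleaner cleaner = new LocalDirCleaner();
        cleaner.deleteLocalDirs(); // e.g. invoked from stop()
        cleaner.deleteLocalDirs(); // e.g. invoked from the shutdown hook; no-op
        System.out.println(cleaner.deletionCount()); // prints 1
    }
}
```

With this guard, whichever of the two shutdown paths runs first performs the deletion, and the other finds the flag already set and returns immediately.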
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/liancheng/spark spark-4091
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/2945.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #2945
----
commit cf6c24038b89042ba387492e6cea9d0335a75125
Author: Cheng Lian <[email protected]>
Date: 2014-10-26T17:33:42Z
Don't delete spark local dirs twice
----
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]