Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2670#discussion_r18473624

--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---

```diff
@@ -666,15 +673,27 @@ private[spark] object Utils extends Logging {
    */
   def deleteRecursively(file: File) {
     if (file != null) {
-      if (file.isDirectory() && !isSymlink(file)) {
-        for (child <- listFilesSafely(file)) {
-          deleteRecursively(child)
+      try {
+        if (file.isDirectory && !isSymlink(file)) {
+          var savedIOException: IOException = null
+          for (child <- listFilesSafely(file)) {
+            try {
+              deleteRecursively(child)
+            } catch {
+              // In case of multiple exceptions, only last one will be thrown
+              case ioe: IOException => savedIOException = ioe
+            }
+          }
+          if (savedIOException != null) {
+            throw savedIOException
+          }
         }
-      }
-      if (!file.delete()) {
-        // Delete can also fail if the file simply did not exist
-        if (file.exists()) {
-          throw new IOException("Failed to delete: " + file.getAbsolutePath)
+      } finally {
```

--- End diff --

So, putting this in a `finally` feels weird. If the code fails to delete some child file, it throws an exception; the `finally` block then runs for the parent directory, whose delete will also fail (the directory still contains children) and throw a second exception that masks the first. So maybe this should just be in the same block as the code above, with no `try..finally`. That way the actual cause of the failure (the child file that could not be deleted) is the one reported.
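To make the suggestion concrete, here is a minimal sketch (not the actual Spark patch) of `deleteRecursively` with the parent delete in the same block as the child loop, rather than in a `finally`, so a failed child delete propagates as the reported cause. `isSymlink` and `listFilesSafely` are simplified stand-ins for the real `Utils` helpers:

```scala
import java.io.{File, IOException}
import java.nio.file.Files

object DeleteSketch {
  // Simplified stand-in for Utils.isSymlink
  private def isSymlink(f: File): Boolean =
    Files.isSymbolicLink(f.toPath)

  // Simplified stand-in for Utils.listFilesSafely:
  // listFiles() returns null on I/O error, so normalize to an empty Seq
  private def listFilesSafely(f: File): Seq[File] = {
    val files = f.listFiles()
    if (files == null) Seq.empty else files.toSeq
  }

  def deleteRecursively(file: File): Unit = {
    if (file != null) {
      if (file.isDirectory && !isSymlink(file)) {
        var savedIOException: IOException = null
        for (child <- listFilesSafely(file)) {
          try {
            deleteRecursively(child)
          } catch {
            // In case of multiple exceptions, only last one will be thrown
            case ioe: IOException => savedIOException = ioe
          }
        }
        if (savedIOException != null) {
          throw savedIOException
        }
      }
      // Same block, no try..finally: if a child delete failed we never
      // reach this point, so the original cause is the one reported.
      if (!file.delete()) {
        // Delete can also fail if the file simply did not exist
        if (file.exists()) {
          throw new IOException("Failed to delete: " + file.getAbsolutePath)
        }
      }
    }
  }
}
```

With a `finally`, the parent's failed `delete()` would raise a second `IOException` that replaces the child's exception as it propagates; keeping the delete in the main block avoids that masking.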