Sean Owen created SPARK-3811:
--------------------------------

             Summary: More robust / standard Utils.deleteRecursively, Utils.createTempDir
                 Key: SPARK-3811
                 URL: https://issues.apache.org/jira/browse/SPARK-3811
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
            Reporter: Sean Owen
            Priority: Minor
I noticed a few issues with how temp directories are created and deleted:

*Minor*

* Guava's {{Files.createTempDir()}} plus {{File.deleteOnExit()}} is used in many tests to make a temp dir, but {{Utils.createTempDir()}} seems to be the standard Spark mechanism.
* The call to {{File.deleteOnExit()}} could be pushed into {{Utils.createTempDir()}} as well, along with this replacement.
* _I messed up the message in an exception in {{Utils}} in SPARK-3794; fixed here._

*Bit Less Minor*

* {{Utils.deleteRecursively()}} fails immediately if any {{IOException}} occurs, instead of trying to delete the remaining files and subdirectories; I've observed this leave temp dirs behind. I suggest changing it to continue in the face of an exception and, at the end, throw one of the possibly several exceptions that occurred.
* {{Utils.createTempDir()}} adds a JVM shutdown hook every time the method is called, even when the new dir is inside a dir already registered for deletion, since that check only happens inside the hook. However, {{Utils}} already manages a set of all dirs to delete on shutdown, called {{shutdownDeletePaths}}; a single hook can be registered once to delete all of these on exit. This is how Tachyon temp paths are cleaned up in {{TachyonBlockManager}}.

I noticed a few other things that might be changed, but wanted to ask first:

* Shouldn't the set of dirs to delete hold {{File}} objects, not just {{String}} paths?
* {{Utils}} manages the set of {{TachyonFile}} instances registered for deletion, but the shutdown hook is managed in {{TachyonBlockManager}}. Shouldn't this logic live together, and not in {{Utils}}? It's specific to Tachyon, and looks a slight bit odd to import in such a generic place.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
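The continue-on-error behavior proposed for {{Utils.deleteRecursively()}} could be sketched like this. This is an illustrative standalone helper, not Spark's actual {{Utils}} code; the object and method names are hypothetical:

```scala
import java.io.{File, IOException}
import scala.collection.mutable.ListBuffer

object DeleteSketch {
  /** Deletes a file or directory tree. Instead of failing fast on the first
   *  error, it records every path it could not delete and throws a single
   *  IOException at the end, so one bad file no longer strands the rest of
   *  the tree. */
  def deleteRecursively(file: File): Unit = {
    val failed = ListBuffer[File]()

    def delete(f: File): Unit = {
      if (f.isDirectory) {
        // listFiles() returns null on an I/O error; count that as a failure
        Option(f.listFiles()) match {
          case Some(children) => children.foreach(delete)
          case None           => failed += f
        }
      }
      // delete() returns false on failure; only record it if the file is
      // actually still there
      if (!f.delete() && f.exists()) failed += f
    }

    delete(file)
    if (failed.nonEmpty) {
      throw new IOException(s"Failed to delete: ${failed.mkString(", ")}")
    }
  }
}
```

Collecting the failures and throwing once at the end keeps the original contract (callers still see an {{IOException}}) while deleting as much of the tree as possible.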
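A single-hook registry along the lines of the {{TachyonBlockManager}} cleanup could look like the sketch below. All names here are illustrative rather than Spark's actual API, and it stores {{File}} objects in the set, per the question above:

```scala
import java.io.File
import scala.collection.mutable

/** Sketch of a one-hook registry: directories are added to a shared set, and
 *  a single JVM shutdown hook, registered lazily on first use, deletes them
 *  all at exit, instead of one hook per createTempDir() call. */
object ShutdownDeleteSketch {
  private val shutdownDeletePaths = mutable.HashSet[File]()
  private var hookRegistered = false

  def registerShutdownDeleteDir(dir: File): Unit = synchronized {
    if (!hookRegistered) {
      Runtime.getRuntime.addShutdownHook(new Thread("delete temp dirs") {
        override def run(): Unit = ShutdownDeleteSketch.synchronized {
          shutdownDeletePaths.foreach(deleteQuietly)
        }
      })
      hookRegistered = true
    }
    shutdownDeletePaths += dir.getAbsoluteFile
  }

  def registeredCount: Int = synchronized { shutdownDeletePaths.size }

  /** Best-effort recursive delete; errors at shutdown are ignored. */
  private def deleteQuietly(f: File): Unit = {
    if (f.isDirectory) Option(f.listFiles()).foreach(_.foreach(deleteQuietly))
    f.delete()
  }
}
```

Because the set deduplicates paths and the hook is registered exactly once, repeated calls to {{createTempDir()}} no longer accumulate hooks in the JVM.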