srowen commented on a change in pull request #24796: [SPARK-27900][CORE] Add
uncaught exception handler to the driver
URL: https://github.com/apache/spark/pull/24796#discussion_r290521928
##########
File path: core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
##########
@@ -204,6 +204,11 @@ private [util] class SparkShutdownHookManager {
hooks.synchronized { hooks.remove(ref) }
}
+ def clear(): Unit = {
Review comment:
Is it a deadlock? I understand just exiting from the driver when an uncaught
exception happens; it's just not clear to me why one would remove the hooks.
If they execute, good. Or are you saying the hooks themselves are the issue?
If so, is there a fix for the hook instead? I'm not sure it's better to not
execute them.
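
For context, the JVM behavior the question hinges on: inside a default
uncaught exception handler, `System.exit(code)` runs all registered
shutdown hooks, while `Runtime.getRuntime.halt(code)` terminates without
running any, so clearing hooks only changes behavior if `exit` is used and
a hook can hang or fail. A minimal hypothetical sketch (names like
`UncaughtHandlerSketch` are illustrative, not the actual patch):

```scala
import java.util.concurrent.CountDownLatch
import java.util.concurrent.atomic.AtomicReference

// Hypothetical sketch: install a JVM-wide default uncaught exception
// handler, as a driver might. The handler records the throwable; a real
// driver would log it and then call System.exit(...) (runs shutdown
// hooks) or Runtime.getRuntime.halt(...) (skips them entirely).
object UncaughtHandlerSketch {
  val lastThrowable = new AtomicReference[Throwable](null)
  val seen = new CountDownLatch(1)

  def install(): Unit = {
    Thread.setDefaultUncaughtExceptionHandler(
      new Thread.UncaughtExceptionHandler {
        override def uncaughtException(t: Thread, e: Throwable): Unit = {
          lastThrowable.set(e)
          seen.countDown()
          // Decision point: exit (hooks run) vs. halt (hooks skipped).
        }
      })
  }

  def main(args: Array[String]): Unit = {
    install()
    // A background thread dies with an uncaught exception; the default
    // handler fires because the thread sets no handler of its own.
    val t = new Thread(new Runnable {
      override def run(): Unit = throw new IllegalStateException("boom")
    })
    t.start()
    seen.await()
    println(lastThrowable.get().getMessage)
  }
}
```

Whether the hooks then run safely is a property of each hook, which is why
fixing a problematic hook (rather than clearing all of them) may be the
narrower remedy.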
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]