Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/18594#discussion_r128872073
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -473,29 +473,36 @@ private[spark] class Executor(
// the default uncaught exception handler, which will terminate the Executor.
logError(s"Exception in $taskName (TID $taskId)", t)
- // Collect latest accumulator values to report back to the driver
- val accums: Seq[AccumulatorV2[_, _]] =
-   if (task != null) {
-     task.metrics.setExecutorRunTime(System.currentTimeMillis() - taskStart)
-     task.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
-     task.collectAccumulatorUpdates(taskFailed = true)
-   } else {
-     Seq.empty
-   }
+ // SPARK-20904: Do not report failure to driver if it happened during shutdown. Because
+ // libraries may set up shutdown hooks that race with running tasks during shutdown,
+ // spurious failures may occur and can result in improper accounting in the driver (e.g.
+ // the task failure would not be ignored if the shutdown happened because of preemption,
+ // instead of an app issue).
+ if (!ShutdownHookManager.inShutdown()) {
--- End diff --
Sure, I can add a log, but it's not guaranteed to be printed. During shutdown the JVM can die at any moment (only shutdown hooks run to completion, and this is not one of them)...
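
For context, here is a minimal self-contained sketch of the guard pattern under discussion. The `shuttingDown` flag and `reportFailure` helper are illustrative stand-ins (not Spark APIs) for `org.apache.spark.util.ShutdownHookManager.inShutdown()` and the executor's driver-update path:

import java.util.concurrent.atomic.AtomicBoolean

object ShutdownGuardSketch {
  // Stand-in for ShutdownHookManager.inShutdown(): a JVM shutdown hook
  // flips this to true so in-flight work can detect that shutdown began.
  private val shuttingDown = new AtomicBoolean(false)
  Runtime.getRuntime.addShutdownHook(new Thread(() => shuttingDown.set(true)))

  // Hypothetical stand-in for reporting a task failure to the driver.
  private def reportFailure(t: Throwable): Unit =
    println(s"reporting failure to driver: ${t.getMessage}")

  def onTaskError(t: Throwable): Unit = {
    if (!shuttingDown.get()) {
      // Normal path: the failure is real, so the driver should hear about it.
      reportFailure(t)
    } else {
      // Shutdown path: the failure is likely an artifact of shutdown hooks
      // racing with running tasks, so it is dropped and only logged locally.
      // As noted above, the JVM may exit before this line is ever flushed,
      // since this code does not run inside a shutdown hook.
      println(s"dropping failure observed during shutdown: ${t.getMessage}")
    }
  }
}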