Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/1482#issuecomment-54664325
This seems reasonable to me. /cc @andrewor14 for another pair of eyes.
To recap [some discussion on the
JIRA](https://issues.apache.org/jira/browse/SPARK-2491): the issue this
addresses is a scenario where the executor JVM is already exiting due to an
uncaught exception, and other shutdown hooks may have deleted files or
otherwise performed cleanup, causing still-running tasks to fail. These
additional failures/errors are confusing when they appear in the log and make
it hard to find the real failure that caused the executor JVM to exit.
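To illustrate the idea (this is a hedged sketch, not Spark's actual code; the class and method names here are invented for illustration): an uncaught-exception path can check whether the JVM is already shutting down and, if so, avoid reporting the secondary failure as a real error.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative only: Spark tracks shutdown state through its own utilities;
// this standalone class just demonstrates the pattern being discussed.
public class ShutdownAwareHandler {
    // Flipped by a shutdown hook once the JVM begins exiting.
    private static final AtomicBoolean shuttingDown = new AtomicBoolean(false);

    public static void markShutdown() {
        shuttingDown.set(true);
    }

    // Failures that arrive while the JVM is already exiting are expected
    // noise (shutdown hooks may have deleted temp files), so they are
    // logged at a lower severity instead of being treated as the cause.
    public static String handle(Throwable t) {
        if (shuttingDown.get()) {
            return "INFO: ignoring task failure during shutdown: " + t.getMessage();
        }
        return "ERROR: uncaught exception: " + t.getMessage();
    }

    public static void main(String[] args) {
        System.out.println(handle(new RuntimeException("shuffle file missing")));
        markShutdown();
        System.out.println(handle(new RuntimeException("shuffle file missing")));
    }
}
```

With this pattern, only the first failure (the one that actually triggered the shutdown) surfaces as an error in the logs.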
@witgo If I understand correctly, the problem here is that confusing
messages appear in the logs, not that the executor fails to stop or to
perform cleanup? If that's the case, can we change the PR's title to
"[SPARK-2491] Don't handle uncaught exceptions from tasks that fail during
executor shutdown"?