GitHub user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/5696#discussion_r29095078
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala ---
@@ -95,14 +95,8 @@ private[spark] class ApplicationMaster(
     val fs = FileSystem.get(yarnConf)

-    Utils.addShutdownHook { () =>
-      // If the SparkContext is still registered, shut it down as a best case effort in case
-      // users do not call sc.stop or do System.exit().
-      val sc = sparkContextRef.get()
-      if (sc != null) {
-        logInfo("Invoking sc stop from shutdown hook")
-        sc.stop()
-      }
+    // This shutdown hook should run *after* the SparkContext is shut down.
+    Utils.addShutdownHook(Utils.SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1) { () =>
--- End diff --
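For context on the mechanism: `Utils.addShutdownHook` takes a priority, and hooks with higher priority values run first, so registering at `Utils.SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1` makes this hook fire after the SparkContext's own shutdown hook. A minimal sketch of that ordering (the object name and internals here are illustrative, not Spark's actual `Utils`/ShutdownHookManager code):

```scala
import scala.collection.mutable

// Sketch of priority-ordered shutdown hooks: a single JVM-level hook drains
// a list sorted highest-priority-first, so lower-priority hooks run later.
object PriorityShutdownHooks {
  private val hooks = mutable.ArrayBuffer.empty[(Int, () => Unit)]

  // One real JVM shutdown hook runs every registered hook in priority order.
  Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
    override def run(): Unit = hooks.synchronized {
      hooks.sortBy { case (priority, _) => -priority }
        .foreach { case (_, hook) => hook() }
    }
  }))

  def addShutdownHook(priority: Int)(hook: () => Unit): Unit =
    hooks.synchronized { hooks += priority -> hook }
}
```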
On the grounds that the YARN AM already tries to do this, I think this is
probably a good change to implement for all modes. It's all too common for
user programs to exit without calling `stop()`, and that does affect things
like finding logs.
Calling `stop()` is idempotent, right? So the extra hook shouldn't hurt
anything, even for well-behaved programs.
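If it is guarded that way, the extra call is safe. A hypothetical sketch of the usual guard shape (not SparkContext's actual code):

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Hypothetical context with an idempotent stop(): compareAndSet lets the
// first caller do the cleanup, and any later call (e.g. the shutdown hook
// firing after the user already called stop()) is a harmless no-op.
class StoppableContext {
  private val stopped = new AtomicBoolean(false)

  def stop(): Unit = {
    if (stopped.compareAndSet(false, true)) {
      // first caller wins: release resources, flush event logs, etc.
    }
    // subsequent callers fall through without side effects
  }
}
```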
You have some spurious whitespace changes in this PR. Not really worth
fixing, but you might tell your IDE not to touch trailing whitespace.