mingkangli-db commented on code in PR #48551:
URL: https://github.com/apache/spark/pull/48551#discussion_r1809344400
##########
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala:
##########
@@ -36,6 +38,17 @@ private[spark] class SparkUncaughtExceptionHandler(val exitOnUncaughtException:
     val _ = SparkExitCode.OOM
   }
+  // The maximum depth to search in the exception cause chain for a fatal error,
+  // as defined by killOnFatalErrorDepth in Executor.scala.
+  //
+  // SPARK-50034: When this handler is called, there is a fatal error in the cause chain within
+  // the specified depth. We should identify that fatal error and exit with the
+  // correct exit code.
+  private val killOnFatalErrorDepth: Int =
+    // At this point SparkEnv might be None
Review Comment:
When registering the uncaught exception handler on the `Master`
([code](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/Master.scala#L1383-1384))
or `Worker`
([code](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala#L981-L982)),
`SparkEnv` is not yet initialized, whereas it is initialized on the Executor, so
we have to account for both cases.
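
For illustration, the depth could be resolved along these lines. This is a rough sketch, not the PR's actual code; wiring it to the `EXECUTOR_KILL_ON_FATAL_ERROR_DEPTH` config entry is my assumption about the intent:

```scala
import org.apache.spark.SparkEnv
import org.apache.spark.internal.config.EXECUTOR_KILL_ON_FATAL_ERROR_DEPTH

// Fall back to the config's default when SparkEnv is not yet created,
// which is the case when the handler is registered on Master or Worker.
private val killOnFatalErrorDepth: Int =
  Option(SparkEnv.get)
    .map(_.conf.get(EXECUTOR_KILL_ON_FATAL_ERROR_DEPTH))
    .getOrElse(EXECUTOR_KILL_ON_FATAL_ERROR_DEPTH.defaultValue.get)
```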
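
And a sketch of the depth-limited search the SPARK-50034 comment in the diff describes. `findFatalError` is a hypothetical helper name, not the PR's method; `Utils.isFatalError` is the existing fatality predicate:

```scala
import scala.annotation.tailrec
import org.apache.spark.util.Utils

// Walk the cause chain at most `depth` links deep and return the first
// fatal error found, if any.
@tailrec
private def findFatalError(t: Throwable, depth: Int): Option[Throwable] =
  if (t == null || depth <= 0) None
  else if (Utils.isFatalError(t)) Some(t)
  else findFatalError(t.getCause, depth - 1)

// The handler can then exit with a code matching what was found,
// e.g. SparkExitCode.OOM when an OutOfMemoryError is in the chain.
```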