cloud-fan commented on code in PR #48551:
URL: https://github.com/apache/spark/pull/48551#discussion_r1814158526
##########
core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala:
##########
@@ -50,19 +63,31 @@ private[spark] class SparkUncaughtExceptionHandler(val exitOnUncaughtException:
     // We may have been called from a shutdown hook. If so, we must not call System.exit().
     // (If we do, we will deadlock.)
     if (!ShutdownHookManager.inShutdown()) {
-      exception match {
-        case _: OutOfMemoryError =>
-          System.exit(SparkExitCode.OOM)
-        case e: SparkFatalException if e.throwable.isInstanceOf[OutOfMemoryError] =>
-          // SPARK-24294: This is defensive code, in case that SparkFatalException is
-          // misused and uncaught.
-          System.exit(SparkExitCode.OOM)
-        case _: KilledByTaskReaperException if exitOnUncaughtException =>
-          System.exit(ExecutorExitCode.KILLED_BY_TASK_REAPER)
-        case _ if exitOnUncaughtException =>
-          System.exit(SparkExitCode.UNCAUGHT_EXCEPTION)
-        case _ =>
-          // SPARK-30310: Don't System.exit() when exitOnUncaughtException is false
+      // Traverse the causes up to killOnFatalErrorDepth layers
Review Comment:
> In Executor.scala, an exception is considered fatal if any exception in the chain (or its causes) is fatal.

Can we reuse the code there? Like adding a function `def findRootCauseError`.
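A minimal sketch of what such a shared helper could look like, assuming a signature invented here for illustration (the name `findRootCauseError` comes from the reviewer's suggestion; the depth parameter mirrors the `killOnFatalErrorDepth` idea from the diff, and the predicate stands in for whatever fatality check Executor.scala uses):

```scala
import scala.annotation.tailrec

object ExceptionUtils {
  // Hypothetical helper: walk the cause chain, checking at most `maxDepth`
  // throwables, and return the first one matching `isFatal`, if any.
  @tailrec
  def findRootCauseError(
      t: Throwable,
      maxDepth: Int)(isFatal: Throwable => Boolean): Option[Throwable] = {
    if (t == null || maxDepth <= 0) {
      None
    } else if (isFatal(t)) {
      Some(t)
    } else {
      findRootCauseError(t.getCause, maxDepth - 1)(isFatal)
    }
  }
}
```

Both call sites could then share the traversal and plug in their own predicate, e.g.:

```scala
val wrapped = new RuntimeException("wrapper", new OutOfMemoryError("boom"))
ExceptionUtils.findRootCauseError(wrapped, 5)(_.isInstanceOf[OutOfMemoryError])
```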
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]