bozhang2820 commented on code in PR #46883:
URL: https://github.com/apache/spark/pull/46883#discussion_r1630446351
##########
core/src/main/scala/org/apache/spark/util/SparkExitCode.scala:
##########
@@ -45,6 +45,10 @@ private[spark] object SparkExitCode {
OutOfMemoryError. */
val OOM = 52
+ /** The default uncaught exception handler was reached and the exception was thrown by
+ TaskReaper. */
+ val KILLED_BY_TASK_REAPER = 53
Review Comment:
Thanks for catching this! Will move KILLED_BY_TASK_REAPER to
ExecutorExitCode and change the value to 57.
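
For reference, a rough, non-authoritative sketch of what the relocated constant could look like in ExecutorExitCode.scala (the object already exists in Spark; the placement and doc comment wording here are illustrative, with the value 57 taken from the comment above):

```scala
private[spark] object ExecutorExitCode {
  // ... existing executor exit codes ...

  /** The default uncaught exception handler was reached and the exception was
   *  thrown by TaskReaper. */
  val KILLED_BY_TASK_REAPER = 57
}
```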
##########
core/src/test/scala/org/apache/spark/JobCancellationSuite.scala:
##########
@@ -455,6 +464,8 @@ class JobCancellationSuite extends SparkFunSuite with Matchers with BeforeAndAft
sc.cancelJobGroup("jobA")
val e = intercept[SparkException] { ThreadUtils.awaitResult(jobA, 15.seconds) }.getCause
assert(e.getMessage contains "cancel")
+ semExec.acquire(2)
+ assert(execLossReason == Seq("Command exited with code 53", "Command exited with code 53"))
Review Comment:
Sure. Will do.
##########
core/src/main/scala/org/apache/spark/executor/Executor.scala:
##########
@@ -1310,3 +1309,5 @@ private[spark] object Executor {
}
}
}
+
+class KilledByTaskReaperException(message: String) extends SparkException(message)
Review Comment:
I did not use an error class since this is not user-facing, and we've been
using class matching in SparkUncaughtExceptionHandler.
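
As a hedged illustration of that class-matching style (the handler object name and hard-coded exit values are assumptions for the sketch, not the actual SparkUncaughtExceptionHandler code; it also assumes the KilledByTaskReaperException class from the diff above is on the classpath):

```scala
// Sketch only: mirrors the class-matching pattern for deciding the exit code,
// assuming KILLED_BY_TASK_REAPER moves to ExecutorExitCode with value 57.
object SketchUncaughtExceptionHandler extends Thread.UncaughtExceptionHandler {
  override def uncaughtException(thread: Thread, exception: Throwable): Unit =
    exception match {
      case _: KilledByTaskReaperException =>
        System.exit(57) // proposed ExecutorExitCode.KILLED_BY_TASK_REAPER
      case _: OutOfMemoryError =>
        System.exit(52) // SparkExitCode.OOM
      case _ =>
        System.exit(50) // SparkExitCode.UNCAUGHT_EXCEPTION
    }
}
```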