tinhto-000 commented on a change in pull request #26955: [SPARK-30310] [Core] Resolve missing match case in SparkUncaughtExceptionHandler and added tests
URL: https://github.com/apache/spark/pull/26955#discussion_r363475923
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala
 ##########
 @@ -48,11 +48,17 @@ private[spark] class SparkUncaughtExceptionHandler(val exitOnUncaughtException:
             System.exit(SparkExitCode.OOM)
           case _ if exitOnUncaughtException =>
             System.exit(SparkExitCode.UNCAUGHT_EXCEPTION)
+          case _ =>
+            // SPARK-30310: Don't System.exit() when exitOnUncaughtException is false
         }
       }
     } catch {
-      case oom: OutOfMemoryError => Runtime.getRuntime.halt(SparkExitCode.OOM)
-      case t: Throwable => Runtime.getRuntime.halt(SparkExitCode.UNCAUGHT_EXCEPTION_TWICE)
+      case oom: OutOfMemoryError =>
+        logError(s"Uncaught OutOfMemoryError in thread $thread, process halted.", oom)
 
 Review comment:
   Thanks for the comment.  
   
   The reason for the logError is that it wasn't obvious to users or devs why the worker would just disappear as DEAD on the UI, and there was nothing in the worker log file to explain what happened. We couldn't find out why until we set SPARK_NO_DAEMONIZE=1 and examined the exit code.
   
   Is there any alternative way to indicate that the process halted unexpectedly?
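   
   For context, here is a minimal standalone sketch of the pattern being discussed: log the fatal error before halting, so the log file records why the process died. This is not Spark's actual `SparkUncaughtExceptionHandler`; the exit-code constants and the `System.err` logging below are simplified placeholders standing in for `SparkExitCode` and Spark's `Logging` trait.
   
   ```scala
   // Minimal sketch (not Spark's actual class): an uncaught-exception handler
   // that logs before halting, so the process log explains an unexpected exit.
   object LoggingUncaughtExceptionHandler extends Thread.UncaughtExceptionHandler {
     // Placeholder exit codes standing in for SparkExitCode.OOM / UNCAUGHT_EXCEPTION_TWICE.
     private val OomExitCode = 52
     private val UncaughtTwiceExitCode = 51
   
     override def uncaughtException(thread: Thread, exception: Throwable): Unit = {
       try {
         // Normal handling goes here (mirroring the match in the diff above).
         // System.exit() runs shutdown hooks, so this block can itself throw.
         System.err.println(s"Uncaught exception in thread $thread: $exception")
         exception match {
           case _: OutOfMemoryError => System.exit(OomExitCode)
           case _                   => // e.g. exitOnUncaughtException == false: don't exit
         }
       } catch {
         case oom: OutOfMemoryError =>
           // Log first so the reason for the halt is visible in the log file,
           // then halt without running shutdown hooks.
           System.err.println(s"Uncaught OutOfMemoryError in thread $thread, process halted: $oom")
           Runtime.getRuntime.halt(OomExitCode)
         case t: Throwable =>
           System.err.println(s"Exception while handling an uncaught exception, process halted: $t")
           Runtime.getRuntime.halt(UncaughtTwiceExitCode)
       }
     }
   }
   ```
   
   Such a handler would be installed with `Thread.setDefaultUncaughtExceptionHandler(LoggingUncaughtExceptionHandler)`. Since `Runtime.getRuntime.halt` skips shutdown hooks, a log line emitted just before it is the only trace left behind, which is the motivation for logging here.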
