pan3793 commented on code in PR #52091:
URL: https://github.com/apache/spark/pull/52091#discussion_r2297621738


##########
core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala:
##########
@@ -1037,6 +1045,12 @@ private[spark] class SparkSubmit extends Logging {
           case e: Throwable => logError("Failed to close SparkContext", e)
         }
       }
+      if (sparkConf.get(SUBMIT_CALL_SYSTEM_EXIT_ON_MAIN_EXIT)) {
+        logInfo(
+          log"Calling System.exit() with exit code ${MDC(LogKeys.EXIT_CODE, 
exitCode)} " +
+          log"because main ${MDC(LogKeys.CONFIG, 
SUBMIT_CALL_SYSTEM_EXIT_ON_MAIN_EXIT.key)}=true")
+        exitFn(exitCode)

Review Comment:
   @itskals I think it's more appropriate to implement this in a shutdown hook. What you actually want is a graceful shutdown mechanism that is not coupled to either daemon or non-daemon threads. If users want to do cleanup work or run graceful shutdown logic before the JVM terminates, they should register a shutdown hook rather than create non-daemon threads.
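   
   For illustration, a minimal sketch of the shutdown-hook approach in user code (the `MyApp` object and the cleanup body are hypothetical placeholders; `sys.addShutdownHook` is the standard Scala wrapper around `Runtime#addShutdownHook`):
   ```scala
   import org.apache.spark.sql.SparkSession
   
   object MyApp {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder().appName("MyApp").getOrCreate()
   
       // Runs when the JVM begins to terminate, whether the exit is triggered
       // by System.exit() in SparkSubmit or by the last non-daemon thread
       // finishing, so cleanup does not depend on thread daemon status.
       sys.addShutdownHook {
         // hypothetical cleanup work goes here
         spark.stop()
       }
   
       // ... application logic ...
     }
   }
   ```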
   
   BTW, you can use three backticks to quote the code block
   ```
   I'm a
     code block
   ```
   


