This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 6211f7be5913 [SPARK-54259][CORE] Downgrade "Shutdown hook called" messages from INFO to DEBUG
6211f7be5913 is described below

commit 6211f7be5913e0f36330ff028bc651a3889949c1
Author: Sandy Ryza <[email protected]>
AuthorDate: Sun Nov 9 23:15:59 2025 -0800

    [SPARK-54259][CORE] Downgrade "Shutdown hook called" messages from INFO to DEBUG
    
    ### What changes were proposed in this pull request?
    
    Downgrades messages like these from INFO to DEBUG:
    
    ```
    25/11/09 06:18:58 INFO ShutdownHookManager: Shutdown hook called
    25/11/09 06:18:58 INFO ShutdownHookManager: Deleting directory /private/var/folders/1v/dqhbgmt10vl6v3tdlwvvx90r0000gp/T/spark-23f1af4f-eba9-47d1-a284-d7af223c59fb
    ```
    
    ### Why are the changes needed?
    
    Shutdown hook invocation messages are irrelevant to most users but show up on every invocation of `spark-submit`.
    
    ### Does this PR introduce _any_ user-facing change?
    
    The messages shown above will no longer appear when the minimum log severity is set to INFO.
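
    Users who still want this output can re-enable it without changing log levels globally by lowering the severity threshold for this one logger. A sketch for `conf/log4j2.properties`, assuming the standard Log4j 2 properties syntax Spark ships with (the label `shutdown` is an arbitrary name chosen here, not something Spark defines):

    ```
    # Hypothetical snippet for conf/log4j2.properties: emit DEBUG output
    # from ShutdownHookManager only, leaving the root logger at INFO.
    logger.shutdown.name = org.apache.spark.util.ShutdownHookManager
    logger.shutdown.level = debug
    ```
    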
    
    ### How was this patch tested?
    
    Ran spark-submit and spark-pipelines.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    Closes #52955 from sryza/shutdown-hook.
    
    Authored-by: Sandy Ryza <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
index af93f781343d..6a6b0299a0bc 100644
--- a/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
+++ b/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala
@@ -61,12 +61,12 @@ private[spark] object ShutdownHookManager extends Logging {
   // Add a shutdown hook to delete the temp dirs when the JVM exits
   logDebug("Adding shutdown hook") // force eager creation of logger
   addShutdownHook(TEMP_DIR_SHUTDOWN_PRIORITY) { () =>
-    logInfo("Shutdown hook called")
+    logDebug("Shutdown hook called")
     // we need to materialize the paths to delete because deleteRecursively removes items from
     // shutdownDeletePaths as we are traversing through it.
     shutdownDeletePaths.toArray.foreach { dirPath =>
       try {
-        logInfo(log"Deleting directory ${MDC(LogKeys.PATH, dirPath)}")
+        logDebug(log"Deleting directory ${MDC(LogKeys.PATH, dirPath)}")
         Utils.deleteRecursively(new File(dirPath))
       } catch {
         case e: Exception =>


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
