Kai Zeng created SPARK-5869:
-------------------------------

             Summary: Exception when deleting Spark local dirs when shutting down DiskBlockManager
                 Key: SPARK-5869
                 URL: https://issues.apache.org/jira/browse/SPARK-5869
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.3.0
            Reporter: Kai Zeng


Running Spark on an EC2 cluster deployed with the spark-ec2 scripts.
This error appears when applications finish:

15/02/17 19:23:44 ERROR util.Utils: Uncaught exception in thread delete Spark local dirs
java.lang.IllegalStateException: Shutdown in progress
        at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
        at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
        at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1613)
        at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)
Exception in thread "delete Spark local dirs" java.lang.IllegalStateException: Shutdown in progress
        at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
        at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
        at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:151)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1613)
        at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)

This issue appears to have been introduced by commit "SPARK-5841: remove DiskBlockManager shutdown hook on stop": `DiskBlockManager.stop()` now calls `Runtime.removeShutdownHook`, which throws `IllegalStateException` when it runs from within the shutdown hook itself, since the JVM is already shutting down at that point.
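One defensive approach would be to tolerate `IllegalStateException` when unregistering the hook during shutdown. This is only a minimal sketch of that idea (the `ShutdownHookGuard` class and `removeHookSafely` name are hypothetical, not Spark code), not the actual patch:

```java
// Hypothetical guard: remove a shutdown hook, tolerating the
// "Shutdown in progress" IllegalStateException that Runtime throws
// once JVM shutdown has already begun.
public class ShutdownHookGuard {

    /** Returns true if the hook was registered and removed, false otherwise. */
    public static boolean removeHookSafely(Thread hook) {
        try {
            return Runtime.getRuntime().removeShutdownHook(hook);
        } catch (IllegalStateException e) {
            // JVM is already shutting down; the hook cannot run again anyway,
            // so it is safe to ignore the failure instead of propagating it.
            return false;
        }
    }

    public static void main(String[] args) {
        Thread hook = new Thread(() -> {});
        Runtime.getRuntime().addShutdownHook(hook);
        System.out.println(removeHookSafely(hook)); // hook was registered
        System.out.println(removeHookSafely(hook)); // already removed
    }
}
```

With this guard, calling stop() from inside the shutdown hook would no longer surface the uncaught exception above.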



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
