[
https://issues.apache.org/jira/browse/SPARK-8333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15059589#comment-15059589
]
Michael Han commented on SPARK-8333:
------------------------------------
Hello,
I encountered this issue today while trying the JSON dataset example from
http://spark.apache.org/docs/latest/sql-programming-guide.html#json-datasets
I ran it on Windows 7 64-bit. I hope this issue can be fixed in the next release of Spark.
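What I ran is roughly the following (a minimal sketch based on the linked guide; the master, app name, and input path are placeholders rather than my exact setup):
{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Sketch of the JSON dataset example from the SQL programming guide.
// The input path points at the sample file shipped with Spark; adjust as needed.
object JsonDatasetRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("json-repro"))
    val sqlContext = new SQLContext(sc)

    val people = sqlContext.read.json("examples/src/main/resources/people.json")
    people.printSchema()
    people.show()

    sc.stop() // the "Failed to delete" error below is logged while the JVM shuts down
  }
}
{code}
The shutdown then produced: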
15/12/16 14:35:44 ERROR DiskBlockManager: Exception while deleting local spark dir: C:\Users\mh6\AppData\Local\Temp\blockmgr-4b1ec88a-5ec8-41e2-add2-7b8fbc2f0b65
java.io.IOException: Failed to delete: C:\Users\mh6\AppData\Local\Temp\blockmgr-4b1ec88a-5ec8-41e2-add2-7b8fbc2f0b65
at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
...
> Spark failed to delete temp directory created by HiveContext
> ------------------------------------------------------------
>
> Key: SPARK-8333
> URL: https://issues.apache.org/jira/browse/SPARK-8333
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.0
> Environment: Windows7 64bit
> Reporter: sheng
> Priority: Minor
> Labels: Hive, metastore, sparksql
> Attachments: test.tar
>
>
> Spark 1.4.0 failed to stop SparkContext.
> {code:title=LocalHiveTest.scala|borderStyle=solid}
> val sc = new SparkContext("local", "local-hive-test", new SparkConf())
> val hc = Utils.createHiveContext(sc)
> ... // execute some HiveQL statements
> sc.stop()
> {code}
> sc.stop() failed to execute cleanly; it threw the following exception:
> {quote}
> 15/06/13 03:19:06 INFO Utils: Shutdown hook called
> 15/06/13 03:19:06 INFO Utils: Deleting directory C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> 15/06/13 03:19:06 ERROR Utils: Exception while deleting Spark temp dir: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> java.io.IOException: Failed to delete: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:963)
> at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:204)
> at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:201)
> at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> at org.apache.spark.util.Utils$$anonfun$1.apply$mcV$sp(Utils.scala:201)
> at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2292)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Utils.scala:2262)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2262)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> at scala.util.Try$.apply(Try.scala:161)
> at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2262)
> at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2244)
> at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> {quote}
> It seems this bug was introduced by SPARK-6907, which creates a local Hive
> metastore in a temp directory. The problem is that the local Hive metastore
> is not shut down correctly: at the end of the application, when
> SparkContext.stop() is called, it tries to delete the temp directory that is
> still in use by the local Hive metastore, and throws an exception.