[
https://issues.apache.org/jira/browse/SPARK-12216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15994192#comment-15994192
]
Supriya Pasham edited comment on SPARK-12216 at 5/3/17 2:46 AM:
Hi Team,
I am executing 'spark-submit' with a jar and a properties file in the below
manner:
-> spark-submit --class package.classname --master local[*]
\Spark.jar data.properties
When I run the above command, 2-3 exceptions are immediately displayed in the
command prompt, with the exception details below.
I have seen that this issue is marked as resolved, but I did not find the
correct resolution.
Please let me know if there is a solution to this issue -
ERROR ShutdownHookManager: Exception while deleting Spark temp dir:
C:\Users\user1\AppData\Local\Temp\spark-5e37d680-2e9f-4aed-ac59-2f24d8387855
java.io.IOException: Failed to delete:
C:\Users\user1\AppData\Local\Temp\spark-5e37d680-2e9f-4aed-ac59-2f24d8387855
Environment details: I am running the commands on a Windows 7 machine.
Request you to provide a solution asap.
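Not a fix, but a commonly reported workaround for this symptom on Windows: the failed deletion happens while the JVM is already exiting, and the leftover temp directory is otherwise harmless, so the error can be silenced in conf/log4j.properties. This is only a sketch of a local configuration change (the logger name is taken from the class in the stack trace; whether it suits your setup is an assumption):

```properties
# conf/log4j.properties (workaround only -- the temp directory is still
# left behind under %TEMP% and must be cleaned up out of band)
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
```

Alternatively, pointing Spark's scratch space at a dedicated directory via `--conf spark.local.dir=C:\tmp\spark` on the spark-submit command line (path illustrative) keeps the leftovers in one place that is easy to purge manually.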
> Spark failed to delete temp directory
> --
>
> Key: SPARK-12216
> URL: https://issues.apache.org/jira/browse/SPARK-12216
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Environment: windows 7 64 bit
> Spark 1.5.2
> Java 1.8.0.65
> PATH includes:
> C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6\bin
> C:\ProgramData\Oracle\Java\javapath
> C:\Users\Stefan\scala\bin
> SYSTEM variables set are:
> JAVA_HOME=C:\Program Files\Java\jre1.8.0_65
> HADOOP_HOME=C:\Users\Stefan\hadoop-2.6.0\bin
> (where the bin\winutils resides)
> both \tmp and \tmp\hive have permissions
> drwxrwxrwx as detected by winutils ls
> Reporter: stefan
> Priority: Minor
>
> The mailing list archives have no obvious solution to this:
> scala> :q
> Stopping spark context.
> 15/12/08 16:24:22 ERROR ShutdownHookManager: Exception while deleting Spark temp dir:
> C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
> java.io.IOException: Failed to delete:
> C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
> at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
> at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
> at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
> at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
> at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
> at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
> at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
> at scala.util.Try$.apply(Try.scala:161)
> at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
> at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
> at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)