I am running Spark on a shared YARN cluster.
My user ID is "online", but I found that when I run my Spark application,
the local directories are created under the "yarn" user ID.
As a result I am unable to delete those local directories, and the application eventually fails.

Please refer to my log below:

14/09/16 21:59:02 ERROR DiskBlockManager: Exception while deleting local spark dir: /hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7
java.io.IOException: Failed to list files for dir: /hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7/3a
        at org.apache.spark.util.Utils$.listFilesSafely(Utils.scala:580)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:592)
        at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:593)
        at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:592)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:592)
        at org.apache.spark.storage.DiskBlockManager$$anonfun$stop$1.apply(DiskBlockManager.scala:163)
        at org.apache.spark.storage.DiskBlockManager$$anonfun$stop$1.apply(DiskBlockManager.scala:160)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:160)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:153)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:151)
        at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:151)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:151)
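
From what I can tell in the Spark sources, the IOException is raised because
java.io.File#listFiles returns null when a directory cannot be read, and Spark
treats that null as a failure. This is roughly (paraphrased, not the exact
1.0.0 source) what Utils.listFilesSafely does:

    import java.io.{File, IOException}

    // Paraphrase of Utils.listFilesSafely: File#listFiles returns null
    // both for I/O errors and for permission problems, so a null result
    // is turned into an IOException naming the directory.
    def listFilesSafely(file: File): Seq[File] = {
      val files = file.listFiles()
      if (files == null) {
        throw new IOException("Failed to list files for dir: " + file)
      }
      files
    }

So the "Failed to list files" message seems to mean only that the process
running the cleanup hook could not read that directory.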


I am unable to access
"/hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7".
For example, running
"ls /hadoop02/hadoop/yarn/local/usercache/online/appcache/application_1410795082830_3994/spark-local-20140916215842-6fe7"
fails with "Permission denied".
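
For completeness, here is a minimal Scala sketch of an equivalent check
(CheckOwner is just a throwaway helper I put together for illustration, not
anything from Spark or Hadoop). Run on the NodeManager host, I would expect
it to throw AccessDeniedException for my "online" user for the same reason
the ls fails, or to print the owner when run by a user that can stat the path:

    import java.nio.file.{Files, Paths}

    // Throwaway helper: print the owner and POSIX permissions of a local
    // directory. Throws AccessDeniedException if the calling user cannot
    // even stat the path (e.g. a parent dir is not traversable).
    object CheckOwner {
      def main(args: Array[String]): Unit = {
        val path = Paths.get(args(0))
        println("owner: " + Files.getOwner(path))
        println("perms: " + Files.getPosixFilePermissions(path))
      }
    }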

I am using Spark 1.0.0 and YARN 2.4.0.

Thanks in advance.


