However, my application contains no logic that accesses local files,
so I assume Spark is internally using the local file system to cache
RDDs.

Judging from the log, the error occurred in Spark's internal logic
rather than in my business logic: Spark is trying to delete local
directories, and they look like cache directories.

/hadoop02/hadoop/yarn/local/*usercache/online/appcache*/application_1410795082830_3994/spark-local-20140916215842-6fe7

Is there any way to avoid using the cache or local directories?
Or a way to access the directories created by the "yarn" user from "online",
my Spark user?
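
As a sketch of what I have been trying (paths are examples, not my actual
cluster layout): Spark's scratch location can be pointed at a directory the
submitting user can write, via spark.local.dir, although my understanding is
that when running on YARN this setting is overridden by the NodeManager's
yarn.nodemanager.local-dirs, which would explain the usercache path above:

```
# spark-defaults.conf -- hypothetical path; pick one the "online" user owns
spark.local.dir    /tmp/spark-scratch
```

If YARN really does force its own local-dirs, then fixing the permissions on
yarn.nodemanager.local-dirs (or the usercache cleanup settings) may be the
only option, but I am not sure.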

Thanks
Regards
Dongkyoung.




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Permission-denied-on-local-dir-tp14422p14453.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
