GitHub user sarutak opened a pull request:
https://github.com/apache/spark/pull/6026
[SPARK-7503][YARN] Resources in .sparkStaging directory can't be cleaned up on error
When we run applications on YARN in cluster mode, resources uploaded to the
.sparkStaging directory are not cleaned up if uploading the local resources
fails.
You can reproduce this issue by running the following command:
```
bin/spark-submit --master yarn --deploy-mode cluster --class <someClassName> <non-existing-jar>
```
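The intended behavior can be sketched as a try/catch around the resource
upload that removes the staging directory before the failure propagates. This
is only an illustrative sketch, not the actual patch in this pull request;
`uploadWithCleanup`, `stagingDir`, and `upload` are hypothetical names, and
local files stand in for a real HDFS staging directory.

```scala
import java.nio.file.{Files, Path}
import scala.jdk.CollectionConverters._
import scala.util.control.NonFatal

// Hypothetical sketch: if any step of the upload throws, delete the
// staging directory on a best-effort basis, then rethrow the error.
def uploadWithCleanup(stagingDir: Path)(upload: Path => Unit): Unit = {
  try {
    upload(stagingDir)
  } catch {
    case NonFatal(e) =>
      if (Files.exists(stagingDir)) {
        val entries = Files.walk(stagingDir).iterator().asScala.toSeq
        // Delete the deepest paths first so directories are empty
        // by the time they are removed.
        entries.sortBy(_.getNameCount).reverse.foreach(Files.deleteIfExists(_))
      }
      throw e
  }
}
```

With this shape, a failed upload (such as a non-existing jar) leaves no
orphaned staging directory behind, while a successful upload is untouched.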
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/sarutak/spark delete-uploaded-resources-on-error
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/6026.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #6026
----
commit f61071b6fc63c976d1fc4dbee3473e4e76f12539
Author: Kousuke Saruta <[email protected]>
Date: 2015-05-09T10:05:30Z
Fixed cleanup problem
----
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]