GitHub user tgravescs commented on the issue:
https://github.com/apache/spark/pull/14916
We shouldn't really clean up previous apps' staging directories, because there is a debug option to keep the staging dir around; i.e., you might want some of those directories to survive when debugging. You would also have to include some time-based parameter, but I really don't like this approach since it puts the cleanup burden on a different application, could affect startup cost, etc.
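For reference, the debug option mentioned above is the `spark.yarn.preserve.staging.files` setting documented in the Spark on YARN configuration; a minimal sketch of how a user would enable it (the comment placement is illustrative):

```
# spark-defaults.conf (or pass as --conf spark.yarn.preserve.staging.files=true
# on spark-submit)
# Keep the staged files (Spark jar, app jar, distributed-cache files) in the
# YARN staging directory after the job ends, instead of deleting them
spark.yarn.preserve.staging.files  true
```

Any automatic cleanup of other applications' staging dirs would have to honor this flag, which is part of why doing it from a later application is awkward.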
This really needs to be solved in YARN itself, with something like a cleanup task that runs after the application finishes. There is a JIRA for this in YARN, but no one has worked on it yet.
Short of that, I would say we add a Spark interface to kill the application so we can shut down and clean up gracefully, versus `yarn kill` just shooting the app.