[
https://issues.apache.org/jira/browse/SPARK-2572?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14535557#comment-14535557
]
Prasanna Gautam commented on SPARK-2572:
----------------------------------------
This is still happening as of Spark 1.3.0 with PySpark: when the context is
closed the files aren't deleted, and sc.clearFiles() does not appear to remove
the /tmp/spark-* directories either.
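Until the leftover scratch directories are cleaned up automatically, a best-effort workaround is to remove them by hand after stopping the context. The sketch below is only an illustration: the base path and the spark-* glob pattern are assumptions based on the default spark.local.dir layout, and it should only be run after sc.stop(), never while an application is still using the directories.

```python
import glob
import os
import shutil


def cleanup_spark_local_dirs(base_dir="/tmp", pattern="spark-*"):
    """Best-effort removal of leftover spark-* scratch directories.

    Workaround sketch only: base_dir and pattern are assumed defaults,
    not an official Spark API. Call after sc.stop().
    """
    removed = []
    for path in glob.glob(os.path.join(base_dir, pattern)):
        if os.path.isdir(path):
            # ignore_errors: another process may still hold files open
            shutil.rmtree(path, ignore_errors=True)
            removed.append(path)
    return removed
```

A safer variant would check directory mtimes and skip anything recent, to avoid racing with a concurrently running application on a shared machine.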
> Can't delete local dir on executor automatically when running spark over
> Mesos.
> -------------------------------------------------------------------------------
>
> Key: SPARK-2572
> URL: https://issues.apache.org/jira/browse/SPARK-2572
> Project: Spark
> Issue Type: Bug
> Components: Mesos
> Affects Versions: 1.0.0
> Reporter: Yadong Qi
> Priority: Minor
>
> When running Spark over Mesos in "fine-grained" or "coarse-grained" mode,
> the local dir (/tmp/spark-local-20140718114058-834c) on the executor is not
> deleted automatically after the application finishes.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]