The default value for spark.worker.cleanup.enabled is false:

// from org.apache.spark.deploy.worker.Worker
private val CLEANUP_ENABLED =
  conf.getBoolean("spark.worker.cleanup.enabled", false)

I wonder if the default should be set to true.
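
For anyone trying the settings Todd suggests below: these are worker-daemon
properties rather than application properties, so in standalone mode they
would normally go into SPARK_WORKER_OPTS in conf/spark-env.sh on each worker
(a sketch; the interval and TTL values below are only examples), followed by
a worker restart:

# conf/spark-env.sh on each standalone worker
# run the cleanup every 30 minutes, removing app data older than 7 days
SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=604800"

As far as I know, this only prunes finished applications' directories under
the worker's work/ directory, so it may not touch the spark-* directories in
the system temp directory that Taeyun describes.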

Cheers

On Thu, May 7, 2015 at 6:19 AM, Todd Nist <tsind...@gmail.com> wrote:

> Have you tried to set the following?
>
> spark.worker.cleanup.enabled=true
> spark.worker.cleanup.appDataTtl=<seconds>
>
> On Thu, May 7, 2015 at 2:39 AM, Taeyun Kim <taeyun....@innowireless.com>
> wrote:
>
>> Hi,
>>
>> After a Spark program completes, three temporary directories remain
>> in the temp directory.
>>
>> The directory names look like this: spark-2e389487-40cc-4a82-a5c7-353c0feefbb7
>>
>> And when the Spark program runs on Windows, a snappy DLL file also remains
>> in the temp directory.
>>
>> The file name looks like this:
>> snappy-1.0.4.1-6e117df4-97b6-4d69-bf9d-71c4a627940c-snappyjava
>>
>> They are created every time the Spark program runs. So the number of
>> files and directories keeps growing.
>>
>> How can I get them deleted?
>>
>> Spark version is 1.3.1 with Hadoop 2.6.
>>
>> Thanks.
>>
>
