Thanks, Raymond.

I duplicated the question. Please see the reply here.
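For anyone landing on this thread later: since the fraction is fixed once the SparkContext is created, it has to be set up front in the SparkConf. A minimal sketch of doing so (the app name and the value 0.4 are illustrative, not recommendations):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.storage.memoryFraction controls the fraction of executor heap
// reserved for caching RDDs (default 0.6 in Spark 1.x). It is read once
// at startup and cannot be changed on a running executor.
val conf = new SparkConf()
  .setAppName("MemoryFractionDemo")            // illustrative name
  .set("spark.storage.memoryFraction", "0.4")  // must be set before the context exists

val sc = new SparkContext(conf)
```

The same property can instead be passed on the command line via `--conf spark.storage.memoryFraction=0.4` when submitting the job; either way it is immutable afterwards.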


2014-09-04 14:27 GMT+08:00 牛兆捷 <nzjem...@gmail.com>:

> But is it possible to make it resizable? When we don't have many RDDs to
> cache, we could give some of that memory to other uses.
>
>
> 2014-09-04 13:45 GMT+08:00 Patrick Wendell <pwend...@gmail.com>:
>
>> Changing this is not supported; it is immutable, like other Spark
>> configuration settings.
>>
>> On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 <nzjem...@gmail.com> wrote:
>> > Dear all:
>> >
>> > Spark uses memory to cache RDDs, and the size of this memory region is
>> > specified by "spark.storage.memoryFraction".
>> >
>> > Once the Executor starts, does Spark support dynamically
>> > adjusting/resizing the size of this region?
>> >
>> > Thanks.
>> >
>> > --
>> > *Regards,*
>> > *Zhaojie*
>>
>
>
>
> --
> *Regards,*
> *Zhaojie*
>
>


-- 
*Regards,*
*Zhaojie*
