Or should I shut down the streaming context gracefully and then start it
again with a different number of executors?
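
Something along these lines is what I have in mind (a rough sketch only;
the restart wiring is hypothetical, and it assumes the executor count is
set via spark.executor.instances, e.g. on YARN):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  // Finish processing the data already received, then also stop the
  // underlying SparkContext.
  ssc.stop(stopSparkContext = true, stopGracefully = true)

  // Start over with a different number of executors.
  // newExecutorCount and batchIntervalSeconds are placeholders.
  val conf = new SparkConf()
    .setAppName("my-streaming-app")
    .set("spark.executor.instances", newExecutorCount.toString)
  val sc = new SparkContext(conf)
  val newSsc = new StreamingContext(sc, Seconds(batchIntervalSeconds))
  // ... re-create the input streams and output operations here ...
  newSsc.start()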

On Sat, May 23, 2015 at 4:00 AM, Saiph Kappa <saiph.ka...@gmail.com> wrote:

> Sorry, but I can't see from TD's comments how to allocate executors on
> demand. It seems to me that he's talking about resources within an
> executor, mapping shards to cores. I want to be able to decommission
> executors/workers/machines.
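>
> Ideally something like the sketch below, using SparkContext's
> requestExecutors/killExecutors developer APIs (which, as I understand
> it, only work with coarse-grained backends such as YARN); the
> ID-tracking listener is just my own scaffolding, since I don't know of
> a direct public accessor for live executor IDs:
>
>   import org.apache.spark.scheduler.{SparkListener,
>     SparkListenerExecutorAdded, SparkListenerExecutorRemoved}
>   import scala.collection.mutable
>
>   // Keep a set of live executor IDs up to date via listener events.
>   val liveExecutors = mutable.Set[String]()
>   sc.addSparkListener(new SparkListener {
>     override def onExecutorAdded(e: SparkListenerExecutorAdded): Unit =
>       liveExecutors += e.executorId
>     override def onExecutorRemoved(e: SparkListenerExecutorRemoved): Unit =
>       liveExecutors -= e.executorId
>   })
>
>   // Scale up: ask the cluster manager for two more executors.
>   sc.requestExecutors(2)
>
>   // Scale down: decommission specific executors by ID.
>   sc.killExecutors(liveExecutors.take(2).toSeq)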
>
> On Sat, May 23, 2015 at 3:31 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> For #1, the answer is yes.
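>>
>> For reference, on YARN it is typically enabled with settings along
>> these lines (a sketch; the external shuffle service must be running on
>> the NodeManagers):
>>
>>   val conf = new SparkConf()
>>     .set("spark.dynamicAllocation.enabled", "true")
>>     .set("spark.dynamicAllocation.minExecutors", "2")
>>     .set("spark.dynamicAllocation.maxExecutors", "20")
>>     // dynamic allocation requires the external shuffle service
>>     .set("spark.shuffle.service.enabled", "true")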
>>
>> For #2, see TD's comments on SPARK-7661.
>>
>> Cheers
>>
>>
>> On Fri, May 22, 2015 at 6:58 PM, Saiph Kappa <saiph.ka...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> 1. Dynamic allocation is currently only supported with YARN, correct?
>>>
>>> 2. In Spark Streaming, is it possible to change the number of executors
>>> while an application is running? If so, can the allocation be controlled
>>> by the application itself, rather than by a predefined automatic policy?
>>> That is, I want to be able to acquire more executors or decommission
>>> executors on demand. Is there some way to achieve this?
>>>
>>> Thanks.
>>>
>>
>>
>
