Re: imposed dynamic resource allocation

2015-12-18 Thread Andrew Or
Hi Antony,

The configuration to enable dynamic allocation is per-application.

If you only wish to enable this for one of your applications, just set
`spark.dynamicAllocation.enabled` to true for that application only. The
way it works under the hood is that the application will start sending
requests to the AM asking for executors. If you did not enable this
config, your application will not make such requests.
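As a sketch, a per-application submit could look like the following (the app name and executor bounds are illustrative; note that dynamic allocation on YARN also requires `spark.shuffle.service.enabled`, since executors can be torn down while their shuffle output is still needed):

```
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  my_app.py
```

Applications submitted without these confs fall back to the usual static
executor count (e.g. `--num-executors`).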

-Andrew

2015-12-11 14:01 GMT-08:00 Antony Mayi :

> Hi,
>
> using spark 1.5.2 on yarn (client mode). I was trying to use dynamic
> resource allocation, but it seems that once it is enabled by the first
> app, every following application is managed that way, even if it
> explicitly disables it.
>
> example:
> 1) yarn configured with org.apache.spark.network.yarn.YarnShuffleService
> as the spark_shuffle aux class
> 2) ran a first app that doesn't specify dynamic allocation / the shuffle
> service - it runs as expected with static executors
> 3) ran a second application that enables spark.dynamicAllocation.enabled
> and spark.shuffle.service.enabled - it is dynamic as expected
> 4) ran another app that doesn't enable (and even explicitly disables)
> dynamic allocation / the shuffle service - still, the executors are
> added/removed dynamically throughout the run
> 5) restarted the nodemanagers to reset this
> 5) restarting nodemanagers to reset this
>
> Is this a known issue, or have I missed something? Can dynamic resource
> allocation be enabled per application?
>
> Thanks,
> Antony.
>
