Hi -

I understand that one can use "spark.deploy.defaultCores" and "spark.cores.max" 
to assign a fixed number of worker cores to each app. However, instead of 
assigning cores statically, I would like Spark to share them dynamically across 
multiple apps. For example, when a single app is running it should get the 
entire cluster's resources, but when other apps are submitted, the free 
resources should be reassigned to the new apps.
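
For concreteness, this is roughly how I am setting the static cap today (a 
minimal sketch; the master URL, app name, and core count are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Static approach: cap this application at a fixed number of cores.
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077")  // standalone master URL (placeholder)
      .setAppName("MyApp")                    // placeholder app name
      .set("spark.cores.max", "8")            // fixed per-app core cap (illustrative value)
    val sc = new SparkContext(conf)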

Is there any configuration setting to achieve this in standalone mode?

Mohammed
