Hi -

Does anybody have any ideas on how to dynamically allocate cores instead of
statically partitioning them among multiple applications? Thanks.

Mohammed

From: Mohammed Guller
Sent: Friday, December 5, 2014 11:26 PM
To: user@spark.apache.org
Subject: Fair scheduling across applications in stand-alone mode

Hi -

I understand that one can use "spark.deploy.defaultCores" and "spark.cores.max"
to assign a fixed number of worker cores to different apps. However, instead of
statically assigning the cores, I would like Spark to assign them dynamically
across multiple apps. For example, when a single app is running, it gets all of
the cluster's resources, but when other apps are submitted, any free resources
get assigned to the new apps.
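
For context, this is the kind of static setup I mean. A minimal sketch (the
master URL and the cap of 8 cores are just illustrative values, not my actual
cluster):

    import org.apache.spark.{SparkConf, SparkContext}

    // Static partitioning: this app is capped at a fixed number of cores via
    // spark.cores.max, even if the rest of the cluster is sitting idle.
    // A cluster-wide default for apps that don't set this can be configured
    // on the master via spark.deploy.defaultCores.
    val conf = new SparkConf()
      .setMaster("spark://master:7077")      // illustrative standalone master URL
      .setAppName("StaticCoreCapExample")
      .set("spark.cores.max", "8")           // hard per-app cap on worker cores

    val sc = new SparkContext(conf)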

Is there any configuration setting to achieve this in stand-alone mode?

Mohammed
