Hello,

What are my options to balance resources between multiple applications
running against a Spark cluster?

I am using the standalone cluster setup [1] on my local machine, and a
single application takes all the available cores as soon as it starts. As
long as that first application is running, no other application gets any
processing done.

I tried running more workers, each with fewer cores, via
SPARK_WORKER_CORES, but the single application still takes everything (see
https://dl.dropboxusercontent.com/u/1529870/spark%20-%20multiple%20applications.png).
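
For reference, this is roughly the kind of thing I put in conf/spark-env.sh
to spawn several smaller workers (the exact values and the use of
SPARK_WORKER_INSTANCES / SPARK_WORKER_MEMORY below are just illustrative of
my setup, not my precise settings):

    # conf/spark-env.sh on my local machine (illustrative values)
    # run two worker instances, each limited to 2 cores and 2 GB of RAM
    export SPARK_WORKER_INSTANCES=2
    export SPARK_WORKER_CORES=2
    export SPARK_WORKER_MEMORY=2g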

Is there any strategy to reallocate resources based on the number of
applications running against the cluster, or is the design mostly geared
towards having a single application running at a time?

Thank you,
TTimo

[1] http://spark.incubator.apache.org/docs/latest/spark-standalone.html
