Hi,

I'm using Spark dynamic allocation on a standalone cluster with 1 master (2
cores & 4 GB RAM) and 1 worker node (14 cores & 30 GB RAM). It works fine
with that setup; however, when the number of workers is increased to 2
(7 cores & 15 GB RAM each) via spark-env.sh (SPARK_WORKER_INSTANCES=2,
etc.), the Spark UI shows no workers running and 0 cores / 0 memory in
use.
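
For reference, the relevant spark-env.sh lines look roughly like this (the exact split of cores and memory is just how I divided the one machine; adjust as needed):

```shell
# spark-env.sh on the worker machine:
# split the single 14-core / 30 GB box into two worker instances
SPARK_WORKER_INSTANCES=2
SPARK_WORKER_CORES=7
SPARK_WORKER_MEMORY=15g
```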

Does Spark dynamic allocation work with more than one worker? If it does,
can anyone let me know how to make it work?

Thanks,
Varun D.
