Hi,
   I run Spark on a cluster with 20 machines, but when I start an
application with the spark-shell, only 4 machines are working; the
others just sit idle, with no memory or CPU used. I can see this in the
web UI.

   I wondered whether the other machines might be busy, so I checked
them with the "top" and "free" commands, but they are not.

   So I just wonder: why doesn't Spark assign work to all 20 machines?
This seems like poor resource usage.
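For reference, here is a sketch of how the shell can be launched so that cores are requested from every worker. This is an assumption on my part (the original post does not show the launch command or the cluster manager); the flags and properties below come from the Spark standalone-mode documentation, and the master URL and core counts are placeholders:

```shell
# Hypothetical example, assuming Spark standalone mode.
# In standalone mode the shell claims up to spark.cores.max cores;
# if that cap (or the default) is low, only a few workers get executors.
spark-shell --master spark://master:7077 \
  --total-executor-cores 80   # e.g. 4 cores on each of 20 workers

# Related standalone-mode setting (spark-defaults.conf):
# spark.deploy.spreadOut  true   # spread executors across workers
#                                # instead of packing them onto a few
```

Another thing worth checking is the number of partitions in the input: Spark runs one task per partition, so a job with only a few partitions can occupy only a few machines no matter how many are available.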
