Hi,
I must run my Spark cluster in standalone mode. I want to know: does Spark
support a capacity scheduler in standalone mode as a scheduling option?
Regards
Conner
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com
Hi,
I use a Spark cluster to run ETL jobs, plus analysis computations on the data
after the ETL stage.
The ETL jobs can keep running for several hours, but each analysis computation
is a short-running job that finishes in a few seconds.
The dilemma I am caught in is that my application runs in a single JVM
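For a setup like the one described above, where long ETL jobs and short analysis jobs share one SparkContext, Spark's documented FAIR scheduling mode with pools is one possible approach. A minimal sketch follows; the pool names (`etl`, `analysis`) and weights are illustrative assumptions, not a definitive configuration:

```xml
<!-- fairscheduler.xml: defines two pools inside one Spark application.
     Pool names and weights here are hypothetical examples. -->
<allocations>
  <pool name="etl">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
  <pool name="analysis">
    <schedulingMode>FAIR</schedulingMode>
    <!-- higher weight so short queries are not starved by long ETL stages -->
    <weight>4</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```

The application would then be submitted with `--conf spark.scheduler.mode=FAIR --conf spark.scheduler.allocation.file=/path/to/fairscheduler.xml`, and each thread can route its jobs to a pool via `sc.setLocalProperty("spark.scheduler.pool", "analysis")`. Note this shares resources within one application; it is not the same as YARN's Capacity Scheduler, which allocates resources across applications.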