Hello, I have a Spark Standalone cluster running in HA mode. I launched an application using spark-submit with cluster deploy mode and supervision enabled, and it launched successfully on one of the worker nodes.
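For reference, the launch looked roughly like this (the master hostnames, main class, and jar path below are placeholders, not my actual values):

```shell
# HA standalone mode: list all masters in the --master URL so the
# submission can fail over if the active master is down.
spark-submit \
  --master spark://master1:7077,master2:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.MyStreamingApp \
  /path/to/my-streaming-app.jar
```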
How can I stop/restart/kill or otherwise manage such a task running in a standalone cluster? There seem to be no options for this in the web interface, and I wonder how I can upgrade my driver in the future. Also, does supervised mode work across worker nodes? I.e., will it relaunch the driver on another node if the current one dies, or does it only handle restarts on the same node after a driver crash? I would love to hear others' experience with this :) Thanks! (PS: I am launching a Spark Streaming application) // Jesper Lundgren