Hi,

when I run a Spark application on my local machine using spark-submit:
$SPARK_HOME/bin/spark-submit --driver-memory 1g <class> <jar>
and interrupt it with Ctrl-C, the current stage is interrupted, but the
process then hangs and only exits after around 5 minutes, and sometimes it
doesn't exit at all. The only way I was able to kill it was
"kill -9 <pid>", but after doing this my system doesn't boot correctly
after a reboot.
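For reference, this is roughly the sequence I end up using; it's only a sketch (finding the pid via jps and the 30-second wait are my own choices, not anything Spark documents):

```shell
# Find the pid of the SparkSubmit driver process (jps ships with the JDK).
pid=$(jps -l | grep SparkSubmit | awk '{print $1}')

# First try a plain SIGTERM so JVM shutdown hooks get a chance to run ...
kill "$pid"

# ... and only fall back to SIGKILL if the process is still alive later.
sleep 30
kill -0 "$pid" 2>/dev/null && kill -9 "$pid"
```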

Is there a better way to properly kill a Spark application?

Thanks,
Grzegorz
