Hello,

I have noticed that in https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala Spark calls System.exit when it encounters an uncaught exception.

I have an application that runs Spark in local mode, and I would like to avoid having the whole application exit if that happens. A rough sketch of the kind of setup I mean is at the end of this message.

Will Spark exit my application in local mode too, or does that behavior apply only in cluster mode? Is there a setting to override it?

Thanks,
Yael
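For context, here is a minimal sketch of the setup I'm describing (LocalSparkApp, the app name, and the logging handler are placeholders I made up for illustration, not my actual code or anything from Spark's API):

import org.apache.spark.sql.SparkSession

object LocalSparkApp {
  def main(args: Array[String]): Unit = {
    // Long-running service that embeds Spark in local mode.
    val spark = SparkSession.builder()
      .appName("embedded-local-spark")
      .master("local[*]")
      .getOrCreate()

    // We install our own default handler so an uncaught exception gets logged
    // instead of terminating the JVM; my question is whether Spark's
    // SparkUncaughtExceptionHandler (which calls System.exit) would still
    // kick in for a process running in local mode.
    Thread.setDefaultUncaughtExceptionHandler((t: Thread, e: Throwable) =>
      System.err.println(s"Uncaught exception in thread ${t.getName}: $e"))

    // ... rest of the application keeps running and reusing `spark` ...
    spark.stop()
  }
}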