[
https://issues.apache.org/jira/browse/SPARK-12265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15051190#comment-15051190
]
Neelesh Srinivas Salian commented on SPARK-12265:
-------------------------------------------------
[~srowen] [~dragos]
Would {{SparkException}} be a good wrapper for all such cases?
Also, I know YARN has {{YarnRuntimeException}}, which wraps all cases that throw
runtime errors. Would it make sense to have a similar implementation in Spark
(unless one is already present)?
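A minimal sketch of that idea, modeled on YARN's unchecked
{{YarnRuntimeException}}; the name {{SparkRuntimeException}} is hypothetical,
chosen only for illustration (Spark's existing
{{org.apache.spark.SparkException}} is a checked {{Exception}}):
{code}
package org.apache.spark

// Hypothetical unchecked wrapper, analogous to Hadoop's
// org.apache.hadoop.yarn.exceptions.YarnRuntimeException.
// It lets internal errors propagate without forcing callers to
// declare or catch them, while remaining catchable in tests.
class SparkRuntimeException(message: String, cause: Throwable = null)
  extends RuntimeException(message, cause)
{code}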
> Spark calls System.exit inside driver instead of throwing exception
> -------------------------------------------------------------------
>
> Key: SPARK-12265
> URL: https://issues.apache.org/jira/browse/SPARK-12265
> Project: Spark
> Issue Type: Bug
> Components: Mesos
> Affects Versions: 1.6.0
> Reporter: Iulian Dragos
>
> Spark may call {{System.exit}} if Mesos sends an error code back to the
> MesosSchedulerDriver. This makes Spark very hard to test, since this
> effectively kills the driver application under test. Such tests may run under
> ScalaTest, which doesn't get a chance to collect a result and populate a
> report.
> Relevant code is in MesosSchedulerUtils.scala:
> {code}
> val ret = mesosDriver.run()
> logInfo("driver.run() returned with code " + ret)
> if (ret != null && ret.equals(Status.DRIVER_ABORTED)) {
>   System.exit(1)
> }
> {code}
> Errors should be signaled with a {{SparkException}} in the correct thread.
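A minimal sketch of that suggestion, replacing the {{System.exit(1)}} above with
a thrown {{SparkException}}. Note this is not a complete fix: since
{{mesosDriver.run()}} blocks on its own thread, the exception would still need
to be handed off to the thread that should observe it, as the description says.
{code}
val ret = mesosDriver.run()
logInfo("driver.run() returned with code " + ret)
if (ret != null && ret.equals(Status.DRIVER_ABORTED)) {
  // Signal the failure to the caller (and to test frameworks such as
  // ScalaTest) instead of terminating the whole JVM.
  throw new SparkException("Mesos scheduler driver aborted: " + ret)
}
{code}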