Github user CodEnFisH commented on the pull request:
https://github.com/apache/spark/pull/3515#issuecomment-65181037
I reproduced the failing test locally and took a look at the log.
The failed test case ("awaitTermination with error in task") checks whether a
task failure is correctly captured by the system.
However, the DAGScheduler does not fail the job even though one of its tasks
fails. The reason is that the job does NOT depend on the failed task: in my
log I saw "Ignoring failure of Stage 0 because all jobs depending on it are
done", which is printed at the end of abortStage() in DAGScheduler. As a result,
the job is not aborted and the exception is never captured.
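To make the behavior concrete, here is a minimal sketch (not the actual Spark code; the `Stage`, `Job`, and `abortStage` names below model the idea only) of the guard at the end of abortStage(): a stage failure aborts only the active jobs that still depend on that stage, and is otherwise ignored.

```scala
object AbortStageSketch {
  case class Stage(id: Int)
  case class Job(id: Int, dependsOnStages: Set[Int])

  // Simplified model of DAGScheduler.abortStage(): return the active jobs
  // that depend on the failed stage. If none do, the failure is ignored,
  // mirroring the log line "Ignoring failure of Stage N because all jobs
  // depending on it are done".
  def abortStage(failed: Stage, activeJobs: Seq[Job]): Seq[Job] = {
    val dependentJobs = activeJobs.filter(_.dependsOnStages.contains(failed.id))
    if (dependentJobs.isEmpty) {
      println(s"Ignoring failure of Stage ${failed.id} " +
        "because all jobs depending on it are done")
    }
    dependentJobs
  }

  def main(args: Array[String]): Unit = {
    val stage0 = Stage(0)
    // The only active job depends on stage 1, not the failed stage 0,
    // so nothing is aborted and the task error never surfaces to the caller.
    val jobs = Seq(Job(id = 1, dependsOnStages = Set(1)))
    assert(abortStage(stage0, jobs).isEmpty)
  }
}
```

Under this model, the test's task failure happens in a stage no active job depends on, so abortStage() returns nothing, the job completes, and awaitTermination never sees the exception.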