Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7028#issuecomment-119291909
Hi @aarondav , sorry, I had missed the part about breaking the API. It
looks to me like the only place `JobWaiter.jobFailed` is called with a
non-SparkException is when there is an [error submitting the
job](https://github.com/apache/spark/blob/70beb808e13f6371968ac87f7cf625ed110375e6/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L794).
That seems pretty minor -- would it technically break the API contract to
change that call site to pass a SparkException as well? A rough sketch of
what I mean is below.
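A minimal, self-contained sketch of that alternative -- not the actual `DAGScheduler` code; the names here are stand-ins -- wrapping the submission-time error before it ever reaches the waiter:

```scala
import org.apache.spark.SparkException

// Sketch only: wrap the submission-time error in a SparkException at the one
// call site that currently passes the raw exception through. `jobFailed` here
// is a stand-in for JobWaiter.jobFailed.
object WrapAtCallSite {
  def jobFailed(exception: Exception): Unit =
    println(s"job failed: $exception")

  def main(args: Array[String]): Unit = {
    try {
      // stage creation can throw, e.g. on a HadoopRDD whose input files were deleted
      throw new java.io.FileNotFoundException("hdfs://.../part-00000")
    } catch {
      case e: Exception =>
        // with this change, the waiter only ever sees SparkExceptions
        jobFailed(new SparkException("Job submission failed", e))
    }
  }
}
```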
Or do you mean we break the API just by adding another wrapping
SparkException, so the cause & msg are different? That is true ... but the
cause & msg were pretty useless for programmatic use before, so as long as
we're not violating some hard rule on compatibility, I don't see that
change as a problem either. To illustrate what actually shifts for a caller:
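(Hypothetical example -- the exception and message here are made up -- showing that the original failure just moves one level down into `getCause`:)

```scala
import org.apache.spark.SparkException

// Illustrative only: how an extra wrapping layer changes what a caller sees
// when it inspects the failure programmatically.
object InspectFailure {
  def main(args: Array[String]): Unit = {
    val original = new java.io.IOException("underlying task failure")
    val wrapped  = new SparkException("Job aborted.", original) // the new extra layer
    println(wrapped.getMessage) // message now describes the wrapper
    println(wrapped.getCause)   // original exception moves down to the cause
  }
}
```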
All that said, I do still think what you've proposed is an improvement ...
I just want to make sure we've explored the alternatives.