Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/16639#discussion_r96814693
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/FetchFailedException.scala ---
@@ -45,6 +45,12 @@ private[spark] class FetchFailedException(
this(bmAddress, shuffleId, mapId, reduceId, cause.getMessage, cause)
}
+  // SPARK-19267. We set the fetch failure in the task context, so that even if there is user-code
+  // which intercepts this exception (possibly wrapping it), the Executor can still tell there was
+  // a fetch failure, and send the correct error msg back to the driver. The TaskContext won't be
+  // defined if this is run on the driver (just in test cases) -- we can safely ignore then.
+  Option(TaskContext.get()).map(_.setFetchFailed(this))
--- End diff --
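As context for the comment below, here is a minimal sketch (illustrative names, not from the PR) of why the flag matters: user code can catch and wrap the exception, so the Executor can no longer recognize a fetch failure by exception type alone.

```scala
object WrappingExample {
  // Stand-in for the shuffle layer's exception; in Spark this would be the
  // private[spark] FetchFailedException.
  class FakeFetchFailed(msg: String) extends Exception(msg)

  def main(args: Array[String]): Unit = {
    try {
      try {
        throw new FakeFetchFailed("failed to fetch shuffle block")
      } catch {
        // User code intercepts and wraps the exception, hiding its type.
        case e: Exception => throw new RuntimeException("user wrapper", e)
      }
    } catch {
      // At this level (conceptually, the Executor) the exception is just a
      // RuntimeException; with SPARK-19267 the Executor can instead consult
      // the flag the constructor recorded on the TaskContext.
      case e: RuntimeException => println(s"caught: $e, cause: ${e.getCause}")
    }
  }
}
```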
Since creating an Exception does not necessarily mean it will be thrown, we should explicitly add this expectation to the documentation/contract of the FetchFailedException constructor, stating that an instance must be created only in order to be thrown immediately (see the sketch below).
This should be fine since FetchFailedException is private[spark] right now.
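A minimal sketch of what that documented contract might look like (the wording is illustrative, not taken from the PR):

```scala
/**
 * Failure to fetch a shuffle block.
 *
 * Note on the constructor contract: creating a FetchFailedException has the
 * side effect of recording the fetch failure on the current TaskContext
 * (SPARK-19267). Instances must therefore be constructed only when they are
 * about to be thrown; never create one speculatively. This constraint is
 * acceptable while the class remains private[spark].
 */
```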