Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/12234#discussion_r58826581
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1297,6 +1297,35 @@ private[spark] object Utils extends Logging {
}
}
+  /**
+   * Execute a block of code, then a catch block, but if exceptions happen in
+   * the catch block, do not suppress the original exception.
+   *
+   * This is primarily an issue with `catch { out.close() }` blocks, where
+   * close needs to be called to clean up `out`, but if an exception happened
+   * in `out.write`, it's likely `out` may be corrupted and `out.close` will
+   * fail as well. This would then suppress the original/likely more meaningful
+   * exception from the original `out.write` call.
+   */
+  def tryWithSafeCatch[T](block: => T)(catchBlock: => Unit): T = {
+    try {
+      block
+    } catch {
+      case cause: Throwable =>
+        // Purposefully not using NonFatal, because even for fatal exceptions
+        // we don't want to have our catchBlock suppress
+        val originalThrowable = cause
+        try {
+          catchBlock
+        } catch {
+          case t: Throwable =>
+            logWarning(s"Suppressing exception in catch: " + t.getMessage, t)
--- End diff --
So the thing is, this exception doesn't make it to the driver. It would be great if the error message that makes it to the driver could contain both errors: the original exception and the one thrown during close/callback. Then users know there is another exception that failed during close, and they can go look up the full stacktrace in the executor.
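One way to carry both errors along, as the comment asks for, is the JDK's suppressed-exception mechanism (`Throwable.addSuppressed`, available since Java 7): instead of only logging the catch-block failure, attach it to the original exception so both stack traces travel together to whoever ultimately catches it. A minimal sketch, where the helper name `tryWithBothErrors` is hypothetical and not part of Spark's `Utils` API:

```scala
// Hypothetical variant of the helper above: rather than merely logging the
// exception thrown by catchBlock, attach it to the original exception via
// addSuppressed, so both errors appear in the rethrown stacktrace.
def tryWithBothErrors[T](block: => T)(catchBlock: => Unit): T = {
  try {
    block
  } catch {
    case original: Throwable =>
      try {
        catchBlock
      } catch {
        case t: Throwable =>
          // The close/callback failure now rides along with the original
          // exception instead of being visible only in executor logs.
          original.addSuppressed(t)
      }
      throw original
  }
}
```

Printing such an exception with `printStackTrace` shows the original error first, followed by a `Suppressed:` section for each attached exception, so the receiving side sees both failures in one message.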