Sameer Agarwal created SPARK-14454:
--------------------------------------
Summary: Better exception handling while marking tasks as failed
Key: SPARK-14454
URL: https://issues.apache.org/jira/browse/SPARK-14454
Project: Spark
Issue Type: Bug
Components: Spark Core
Reporter: Sameer Agarwal
Improve the handling of exceptions inside catch blocks when the code within the
block itself throws an exception. For instance, here is the code in a catch
block before this change in WriterContainer.scala:
{code}
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
{code}
If markTaskFailed or currentWriter.close throws an exception, we currently lose
the original cause. This PR fixes the problem by introducing a utility
function, Utils.tryWithSafeCatch, that suppresses (via Throwable.addSuppressed)
any exceptions thrown within the catch block and rethrows the original
exception.
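
The helper described above could be sketched roughly as follows. This is an
illustrative sketch only, not the actual Spark implementation; the signature
and parameter names (block, catchBlock) are assumptions:

{code}
object Utils {
  // Hypothetical sketch: run `block`; if it throws, run the cleanup
  // `catchBlock`. Any exception thrown by the cleanup is attached to the
  // original cause via addSuppressed, and the original cause is rethrown,
  // so the root failure is never lost.
  def tryWithSafeCatch[T](block: => T)(catchBlock: Throwable => Unit): T = {
    try {
      block
    } catch {
      case cause: Throwable =>
        try {
          catchBlock(cause)
        } catch {
          case suppressed: Throwable => cause.addSuppressed(suppressed)
        }
        throw cause
    }
  }
}
{code}

With this pattern, a failing cleanup no longer masks the task's original
exception; it is still visible via getSuppressed on the rethrown cause.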
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]