[ https://issues.apache.org/jira/browse/SPARK-14454?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Davies Liu updated SPARK-14454:
-------------------------------
    Fix Version/s: 1.6.2

> Better exception handling while marking tasks as failed
> -------------------------------------------------------
>
>                 Key: SPARK-14454
>                 URL: https://issues.apache.org/jira/browse/SPARK-14454
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Sameer Agarwal
>             Fix For: 1.6.2, 2.0.0
>
>
> Add support for better handling of exceptions inside catch blocks when the
> code within the block itself throws an exception. For instance, here is the
> code in a catch block before this change in WriterContainer.scala:
> {code}
> logError("Aborting task.", cause)
> // Call failure callbacks first, so we have a chance to clean up the writer.
> TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
> if (currentWriter != null) {
>   currentWriter.close()
> }
> abortTask()
> throw new SparkException("Task failed while writing rows.", cause)
> {code}
> If markTaskFailed or currentWriter.close throws an exception, we currently
> lose the original cause. This PR fixes the problem by implementing a utility
> function, Utils.tryWithSafeCatch, that suppresses (via Throwable.addSuppressed)
> any exception thrown within the catch block and rethrows the original
> exception.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
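The suppression pattern the issue describes can be sketched as follows in Scala. Note this is an illustrative sketch, not Spark's actual implementation: the object name `SafeCatch` and the exact signature of `tryWithSafeCatch` are assumptions (the real helper was added to Spark's `Utils`); only the underlying mechanism, `Throwable.addSuppressed` followed by rethrowing the original cause, comes from the issue text.

```scala
object SafeCatch {
  // Illustrative sketch of the tryWithSafeCatch idea (names and signature are
  // assumptions, not Spark's exact API): run `cleanup`; if it throws, attach
  // that exception as suppressed on `original` instead of letting it mask the
  // original failure, then rethrow the original.
  def tryWithSafeCatch(original: Throwable)(cleanup: => Unit): Nothing = {
    try cleanup catch {
      // Record the secondary failure without losing `original`.
      case t: Throwable => original.addSuppressed(t)
    }
    throw original
  }
}
```

With this shape, a `currentWriter.close()` that fails inside the catch block surfaces as a suppressed exception attached to the original task failure (visible in the stack trace) rather than replacing it.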