rezasafi commented on issue #24142: [SPARK-27194][core] Job failures when task attempts do not clean up spark-staging parquet files
URL: https://github.com/apache/spark/pull/24142#issuecomment-474386789

I am just worried that this fix could cause duplicates. Wouldn't it be better to call dataWriter.abort() before raising the exception here:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FileFormatWriter.scala#L257

changing it to:

```
case t: Throwable =>
  dataWriter.abort()
  throw new SparkException("Task failed while writing rows.", t)
```
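For illustration, here is a minimal, self-contained sketch of the "abort before rethrow" pattern this suggests. `DataWriterLike` and `executeTaskSketch` are hypothetical stand-ins for the writer and task-execution code in `FileFormatWriter`, not Spark's actual API:

```
import org.apache.spark.SparkException

// Stand-in for the data writer used inside FileFormatWriter.executeTask.
// This is a hypothetical trait for illustration, not the real Spark interface.
trait DataWriterLike {
  def write(rows: Iterator[String]): Unit
  def commit(): Unit
  def abort(): Unit
}

def executeTaskSketch(dataWriter: DataWriterLike, rows: Iterator[String]): Unit = {
  try {
    // Write all rows for this task, then commit its output.
    dataWriter.write(rows)
    dataWriter.commit()
  } catch {
    case t: Throwable =>
      // Proposed change: abort the writer so partially written staging files
      // are cleaned up before the task failure is surfaced to the scheduler.
      dataWriter.abort()
      throw new SparkException("Task failed while writing rows.", t)
  }
}
```

The trade-off raised above is that aborting here, in addition to whatever cleanup the committer already performs, must not leave the task in a state where a retried attempt re-commits the same output and produces duplicates.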
