zsxwing commented on a change in pull request #24382: [SPARK-27330][SS] support task abort in foreach writer
URL: https://github.com/apache/spark/pull/24382#discussion_r308917449
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ForeachWriterTable.scala
##########
@@ -141,17 +143,33 @@ class ForeachDataWriter[T](
writer.process(rowConverter(record))
} catch {
case t: Throwable =>
- writer.close(t)
+ closeWriter(t)
throw t
}
}
override def commit(): WriterCommitMessage = {
- writer.close(null)
+ closeWriter(null)
ForeachWriterCommitMessage
}
- override def abort(): Unit = {}
+ override def abort(): Unit = {
+ closeWriter(new RuntimeException("Foreach writer has been aborted"))
Review comment:
I think it's better to change `DataWriter.abort` to pass in the cause of the failure, rather than creating this `RuntimeException`. An implementation may care about the error and handle it differently. @cloud-fan what do you think?
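For illustration, the suggested API change might look roughly like this. This is a hedged sketch, not the actual Spark `DataWriter` interface; the trait shape and the `closeWriter` forwarding are assumptions based on the diff above:

```scala
// Hypothetical sketch of the proposed change: DataWriter.abort receives the
// cause of the task failure instead of taking no arguments.
trait DataWriter[T] {
  def write(record: T): Unit
  def commit(): WriterCommitMessage
  // Proposed signature: implementations can inspect the real error and
  // react differently (e.g. log it, classify it, clean up accordingly).
  def abort(cause: Throwable): Unit
}
```

`ForeachDataWriter` could then forward the actual failure instead of fabricating a synthetic exception, e.g. `override def abort(cause: Throwable): Unit = closeWriter(cause)`, so that `ForeachWriter.close` sees the true reason the task was aborted.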