rdblue commented on a change in pull request #24382: [SPARK-27330][SS] support task abort in foreach writer
URL: https://github.com/apache/spark/pull/24382#discussion_r309521630
 
 

 ##########
 File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ForeachWriterTable.scala
 ##########
 @@ -141,17 +143,33 @@ class ForeachDataWriter[T](
       writer.process(rowConverter(record))
     } catch {
       case t: Throwable =>
-        writer.close(t)
+        closeWriter(t)
         throw t
     }
   }
 
   override def commit(): WriterCommitMessage = {
-    writer.close(null)
+    closeWriter(null)
     ForeachWriterCommitMessage
   }
 
-  override def abort(): Unit = {}
+  override def abort(): Unit = {
 
 Review comment:
   I can see that the exception is passed to close. My question is: what does the writer do differently based on the exception? If this is just to satisfy an API and any exception can be passed, then I don't think it matters. There is effort required to pass the exception through, and without a reasonable use case I'm wondering why it is necessary.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
