eyalzit commented on a change in pull request #24382: [SPARK-27330][SS] support task abort in foreach writer
URL: https://github.com/apache/spark/pull/24382#discussion_r277605956
 
 

 ##########
 File path: docs/structured-streaming-programming-guide.md
 ##########
 @@ -2145,6 +2145,10 @@ streamingDatasetOfString.writeStream.foreach(
     def close(errorOrNull: Throwable): Unit = {
       // Close the connection
     }
+
+    def abort(): Unit = {
+      // Close the connection, this method is optional
 
 Review comment:
   Take a look at the current ForeachDataWriter implementation: when processing fails, the exception is consumed by the close method and then rethrown in order to notify the stream that the task has failed. Because of that rethrow, abort gets called afterwards, so if I call close again from abort, close would be invoked a second time, this time with only a generic exception.
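
   A minimal sketch of the control flow described above. This is not the actual Spark source; the class name `SketchForeachDataWriter` and its method shapes are assumptions used only to illustrate the double-close concern:

   ```scala
   import org.apache.spark.sql.ForeachWriter

   // Hypothetical wrapper mirroring the behaviour described in this comment:
   // on a processing failure, the user's ForeachWriter is closed with the real
   // cause and the exception is rethrown; the task framework then calls abort(),
   // so closing again there would run close() a second time with only a
   // generic exception.
   class SketchForeachDataWriter[T](writer: ForeachWriter[T]) {
     private var closed = false

     def write(record: T): Unit = {
       try {
         writer.process(record)
       } catch {
         case t: Throwable =>
           closeWriter(t)   // close() sees the real cause here
           throw t          // rethrow so the stream learns the task failed
       }
     }

     def commit(): Unit = closeWriter(null)

     // Invoked by the task framework after the rethrown exception above.
     def abort(): Unit = {
       // Calling closeWriter(someGenericException) here would invoke the
       // user's close() a second time with a generic exception -- the
       // double-close problem this comment is pointing out.
     }

     private def closeWriter(cause: Throwable): Unit = {
       if (!closed) {
         writer.close(cause)
         closed = true
       }
     }
   }
   ```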
