Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20529#discussion_r166857279
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/WriteToDataSourceV2.scala ---
@@ -92,12 +92,14 @@ case class WriteToDataSourceV2Exec(writer: DataSourceWriter, query: SparkPlan) e
        logInfo(s"Data source writer $writer committed.")
      }
    } catch {
-      case _: InterruptedException if writer.isInstanceOf[StreamWriter] =>
-        // Interruption is how continuous queries are ended, so accept and ignore the exception.
+      case _: SparkException if writer.isInstanceOf[StreamWriter] =>
--- End diff ---
I agree with @srowen that catching `SparkException` swallows too much. Also, you changed both this line and the lines below; I'm not sure which change is your intended fix.
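
To make the concern concrete, here is a minimal sketch of a narrower guard. `InterruptionCheck.isQueryInterruption` is a hypothetical helper, not code from this PR, and it assumes the interruption arrives wrapped somewhere in the `SparkException` cause chain:

    import org.apache.spark.SparkException

    object InterruptionCheck {
      // Walk the cause chain and treat the failure as a benign
      // continuous-query interruption only if an InterruptedException
      // is at its root; anything else is a real error.
      def isQueryInterruption(e: Throwable): Boolean = e match {
        case _: InterruptedException => true
        case se: SparkException if se.getCause != null =>
          isQueryInterruption(se.getCause)
        case _ => false
      }
    }

With something like that, the catch clause could stay specific, e.g. `case e: SparkException if writer.isInstanceOf[StreamWriter] && InterruptionCheck.isQueryInterruption(e) =>`, so unrelated `SparkException`s would still propagate.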
---