Github user jose-torres commented on a diff in the pull request:
https://github.com/apache/spark/pull/20602#discussion_r167994087
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamExecution.scala ---
@@ -369,7 +370,11 @@ abstract class StreamExecution(
              //   exception
              // UncheckedExecutionException - thrown by codes that cannot throw a checked
              //   ExecutionException, such as BiFunction.apply
    -         case e2 @ (_: UncheckedIOException | _: ExecutionException | _: UncheckedExecutionException)
    +         // SparkException - thrown if the interrupt happens in the middle of an RPC wait
    +         case e2 @ (_: UncheckedIOException |
    +                    _: ExecutionException |
    +                    _: UncheckedExecutionException |
    +                    _: SparkException)
--- End diff ---
I agree with the worry, but I'm not sure I see a better solution. The only
other alternatives I can think of are matching against the specific exception
message string, or changing ThreadUtils.awaitResult() to throw a custom
exception type. Do you have any thoughts?
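
To make the second alternative concrete, here is a minimal sketch of what a
custom exception thrown by ThreadUtils.awaitResult() might look like. The
names `StreamInterruptedException` and `AwaitSketch` are invented for
illustration; they do not exist in Spark:

```scala
// Hypothetical sketch (not Spark's actual code) of the "custom exception"
// alternative: have an awaitResult-style helper wrap an interrupt in a
// dedicated exception type, so StreamExecution could match on that type
// instead of on the broad SparkException.

// Assumed name; Spark has no such class.
class StreamInterruptedException(cause: Throwable)
  extends RuntimeException("interrupted while awaiting a result", cause)

object AwaitSketch {
  // Stand-in for ThreadUtils.awaitResult: runs the body and rewraps an
  // InterruptedException in the dedicated type.
  def awaitResult[T](body: => T): T =
    try body
    catch {
      case e: InterruptedException => throw new StreamInterruptedException(e)
    }
}
```

The caller in StreamExecution could then write
`case e2: StreamInterruptedException => ...` without also swallowing every
other SparkException raised during an RPC wait.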
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]