Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21342#discussion_r189603209
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/BroadcastExchangeExec.scala
---
@@ -111,12 +112,18 @@ case class BroadcastExchangeExec(
           SQLMetrics.postDriverMetricUpdates(sparkContext, executionId, metrics.values.toSeq)
           broadcasted
         } catch {
+          // SPARK-24294: To bypass scala bug: https://github.com/scala/bug/issues/9554, we throw
+          // SparkFatalException, which is a subclass of Exception. ThreadUtils.awaitResult
+          // will catch this exception and re-throw the wrapped fatal throwable.
           case oe: OutOfMemoryError =>
--- End diff ---
Not related to this PR, but I'm a little worried about catching `OutOfMemoryError` here.
Spark has `SparkOutOfMemoryError`, and it seems more reasonable to catch
that instead. This can be fixed in another PR.
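
The wrap-and-unwrap pattern the diff refers to can be sketched as follows. This is a simplified illustration, not Spark's actual code: `SparkFatalException` and `awaitResult` here are minimal stand-ins that mimic the behavior described in the diff comment (wrap a fatal throwable in a plain `Exception` inside the future, so the waiting thread can catch it and re-throw the original, sidestepping scala/bug#9554, where `scala.concurrent` special-cases fatal throwables):

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Stand-in for Spark's SparkFatalException: a plain Exception carrying
// a fatal throwable, so scala.concurrent will propagate it normally.
final class SparkFatalException(val throwable: Throwable) extends Exception(throwable)

object AwaitSketch {
  // Stand-in for ThreadUtils.awaitResult: unwrap SparkFatalException and
  // re-throw the original fatal throwable on the waiting thread.
  def awaitResult[T](future: Future[T], timeout: Duration): T =
    try Await.result(future, timeout)
    catch {
      case e: SparkFatalException => throw e.throwable
    }
}

object Demo extends App {
  implicit val ec: ExecutionContext =
    ExecutionContext.fromExecutor(Executors.newSingleThreadExecutor())

  val f = Future[Int] {
    try throw new OutOfMemoryError("simulated OOM for illustration")
    catch {
      // Inside the future body: wrap the fatal error so the promise is
      // completed with a non-fatal failure instead of killing the thread.
      case oe: OutOfMemoryError => throw new SparkFatalException(oe)
    }
  }

  try AwaitSketch.awaitResult(f, 10.seconds)
  catch {
    case oom: OutOfMemoryError => println(s"caught on caller thread: ${oom.getMessage}")
  }
  sys.exit(0)
}
```

Without the wrapping, a fatal throwable thrown inside the `Future` body would not complete the promise at all, and the caller's `Await.result` would simply time out rather than see the error.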
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]