Github user jinxing64 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21424#discussion_r191107018
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala ---
    @@ -106,11 +108,20 @@ private[execution] object HashedRelation {
               1),
             0)
         }
    -
    -    if (key.length == 1 && key.head.dataType == LongType) {
    -      LongHashedRelation(input, key, sizeEstimate, mm)
    -    } else {
    -      UnsafeHashedRelation(input, key, sizeEstimate, mm)
    +    try {
    +      if (key.length == 1 && key.head.dataType == LongType) {
    +        LongHashedRelation(input, key, sizeEstimate, mm)
    +      } else {
    +        UnsafeHashedRelation(input, key, sizeEstimate, mm)
    +      }
    +    } catch {
    +      case oe: SparkOutOfMemoryError =>
    +        throw new SparkOutOfMemoryError(s"If this SparkOutOfMemoryError happens in Spark driver," +
    --- End diff ---
    
    So, should I change it back?
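    
    For reference, here is a minimal self-contained sketch of the error-wrapping pattern
    shown in the diff above. The builder method and the exact hint text are illustrative
    assumptions, not the PR's final wording; only the use of the
    SparkOutOfMemoryError(String) constructor is taken from the diff itself.
    
        import org.apache.spark.memory.SparkOutOfMemoryError
    
        object HashedRelationOomSketch {
          // Hypothetical stand-in for LongHashedRelation / UnsafeHashedRelation
          // construction, which can exhaust the available memory while building
          // the broadcast hash table.
          private def buildRelation(): AnyRef =
            throw new SparkOutOfMemoryError("Unable to acquire 64 bytes of memory")
    
          def buildWithHint(): AnyRef = {
            try {
              buildRelation()
            } catch {
              case oe: SparkOutOfMemoryError =>
                // Re-throw with a hint: an OOM raised while building the relation
                // on the driver is easy to misread as an executor-side problem.
                throw new SparkOutOfMemoryError(
                  "Not enough memory to build the hashed relation. If this error " +
                    "happens in the Spark driver, consider increasing " +
                    "spark.driver.memory or disabling the broadcast join. " +
                    "Original message: " + oe.getMessage)
            }
          }
        }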


---
