Repository: spark
Updated Branches:
  refs/heads/master 676bbb244 -> bfa3d32f7


[SPARK-26117][FOLLOW-UP][SQL] throw SparkOutOfMemoryError instead of 
SparkException in UnsafeHashedRelation

## What changes were proposed in this pull request?

When building the hash map runs out of memory while appending a row, we should 
throw SparkOutOfMemoryError, which is more accurate than the generic 
SparkException. This PR fixes that.
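The pattern behind the change can be sketched as follows. This is a simplified, self-contained illustration, not Spark's actual classes: `BinaryMap` and `buildHashMap` are hypothetical stand-ins for `BytesToBytesMap` and the build loop in `UnsafeHashedRelation`, and the local `SparkOutOfMemoryError` stands in for `org.apache.spark.memory.SparkOutOfMemoryError`. The point is that a failed append frees the partially built map and surfaces as an OOM-specific error, so callers can distinguish memory exhaustion from other failures.

```scala
// Hypothetical stand-in for org.apache.spark.memory.SparkOutOfMemoryError:
// an OutOfMemoryError subclass that Spark's task error handling recognizes.
class SparkOutOfMemoryError(msg: String) extends OutOfMemoryError(msg)

// Hypothetical stand-in for BytesToBytesMap: append returns false when
// no more memory can be acquired for the entry.
trait BinaryMap {
  def append(key: Array[Byte], value: Array[Byte]): Boolean
  def free(): Unit
}

def buildHashMap(map: BinaryMap, rows: Iterator[(Array[Byte], Array[Byte])]): Unit = {
  for ((key, value) <- rows) {
    if (!map.append(key, value)) {
      map.free() // release memory already allocated for the map
      // Signal memory exhaustion specifically, not a generic failure.
      throw new SparkOutOfMemoryError("not enough memory to build hash map")
    }
  }
}
```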

## How was this patch tested?

N/A

Closes #23190 from heary-cao/throwUnsafeHashedRelation.

Authored-by: caoxuewen <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bfa3d32f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/bfa3d32f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/bfa3d32f

Branch: refs/heads/master
Commit: bfa3d32f7719cd4bfb2c161fe4a6bd3eea148158
Parents: 676bbb2
Author: caoxuewen <[email protected]>
Authored: Mon Dec 3 16:18:22 2018 +0800
Committer: Wenchen Fan <[email protected]>
Committed: Mon Dec 3 16:18:22 2018 +0800

----------------------------------------------------------------------
 .../org/apache/spark/sql/execution/joins/HashedRelation.scala  | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/bfa3d32f/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala
index 86eb47a..e8c01d4 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala
@@ -24,7 +24,7 @@ import com.esotericsoftware.kryo.io.{Input, Output}
 
 import org.apache.spark.{SparkConf, SparkEnv, SparkException}
 import org.apache.spark.internal.config.MEMORY_OFFHEAP_ENABLED
-import org.apache.spark.memory.{MemoryConsumer, StaticMemoryManager, TaskMemoryManager}
+import org.apache.spark.memory._
 import org.apache.spark.sql.catalyst.InternalRow
 import org.apache.spark.sql.catalyst.expressions._
 import org.apache.spark.sql.catalyst.plans.physical.BroadcastMode
@@ -316,7 +316,9 @@ private[joins] object UnsafeHashedRelation {
           row.getBaseObject, row.getBaseOffset, row.getSizeInBytes)
         if (!success) {
           binaryMap.free()
-          throw new SparkException("There is no enough memory to build hash map")
+          // scalastyle:off throwerror
+          throw new SparkOutOfMemoryError("There is no enough memory to build hash map")
+          // scalastyle:on throwerror
         }
       }
     }


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
