Github user cxzl25 commented on a diff in the pull request:
https://github.com/apache/spark/pull/21311#discussion_r190345942
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/joins/HashedRelationSuite.scala ---
@@ -254,6 +254,30 @@ class HashedRelationSuite extends SparkFunSuite with SharedSQLContext {
     map.free()
   }
+  test("LongToUnsafeRowMap with big values") {
+    val taskMemoryManager = new TaskMemoryManager(
+      new StaticMemoryManager(
+        new SparkConf().set(MEMORY_OFFHEAP_ENABLED.key, "false"),
+        Long.MaxValue,
+        Long.MaxValue,
+        1),
+      0)
+    val unsafeProj = UnsafeProjection.create(Array[DataType](StringType))
+    val map = new LongToUnsafeRowMap(taskMemoryManager, 1)
+
+    val key = 0L
+    // the page array is initialized with length 1 << 17 (1M bytes),
+    // so here we need a value larger than 1 << 18 (2M bytes), to trigger the bug
+    val bigStr = UTF8String.fromString("x" * (1 << 22))
--- End diff ---
Yes. I think so.
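
For context, the hunk above is cut off at the commented line, so the rest of the test is not shown. A plausible continuation (a sketch only, not the actual patch) would append the oversized row, optimize the map, and read the value back. append, optimize, getValue, and free are existing LongToUnsafeRowMap methods; the specific assertion below is an assumption about what the test verifies:

    // Sketch of a possible continuation of the truncated test above.
    // The exact remainder of the patch is not shown in this hunk.
    map.append(key, unsafeProj(InternalRow(bigStr)))  // write the 4MB row
    map.optimize()  // compact storage (may switch to dense mode for key 0)

    val resultRow = new UnsafeRow(1)
    // reading the row back should return the full 4MB string intact
    assert(map.getValue(key, resultRow).getUTF8String(0) === bigStr)
    map.free()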
---