Github user dongjoon-hyun commented on a diff in the pull request:
https://github.com/apache/spark/pull/21206#discussion_r185296305
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/vectorized/ColumnarBatchSuite.scala ---
@@ -1333,4 +1334,19 @@ class ColumnarBatchSuite extends SparkFunSuite {
column.close()
}
+
+ testVector("WritableColumnVector.reserve(): requested capacity is too large", 1024, ByteType) {
+ column =>
+ val capacity = Integer.MAX_VALUE - 1
--- End diff ---
Actually, we already have existing test coverage for this in this test suite: [test("exceeding maximum capacity should throw an error")](https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/execution/vectorized/ColumnarBatchSuite.scala#L1209-L1226).
Can we remove this test case from this PR?
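For context, a minimal sketch of the kind of check both tests exercise is below. It is not the suite's actual code: it assumes a plain `OnHeapColumnVector` and ScalaTest's `intercept` (available via `SparkFunSuite`), and the exact exception type and message are assumptions that may differ from what Spark actually throws.

```scala
import org.apache.spark.sql.execution.vectorized.OnHeapColumnVector
import org.apache.spark.sql.types.ByteType

// Hypothetical sketch: reserving far more capacity than can ever be
// allocated should fail with an error instead of overflowing silently.
val column = new OnHeapColumnVector(1024, ByteType)
try {
  val ex = intercept[RuntimeException] {
    column.reserve(Integer.MAX_VALUE - 1)
  }
  // The message check is illustrative; the real wording is implementation-defined.
  assert(ex.getMessage.contains("Cannot reserve"))
} finally {
  column.close()
}
```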
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]