Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21912#discussion_r214253479
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ArrayData.scala ---
@@ -34,6 +36,32 @@ object ArrayData {
     case a: Array[Double] => UnsafeArrayData.fromPrimitiveArray(a)
     case other => new GenericArrayData(other)
   }
+
+
+  /**
+   * Allocate [[UnsafeArrayData]] or [[GenericArrayData]] based on given parameters.
+   *
+   * @param elementSize a size of an element in bytes
+   * @param numElements the number of elements the array should contain
+   * @param isPrimitiveType whether the type of an element is primitive type
+   * @param additionalErrorMessage string to include in the error message
+   */
+  def allocateArrayData(
+      elementSize: Int,
--- End diff ---
`elementSize` is only used when creating an unsafe array. I think we can just have
an `elementSize: Option[Int]` and remove the `isPrimitiveType` parameter, roughly as
in the sketch below.
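A minimal sketch of the suggested shape, as a standalone example rather than the actual
Spark code: the `AllocatedArray` hierarchy and the method body are illustrative
placeholders, not the real `UnsafeArrayData`/`GenericArrayData` allocation paths.

```scala
// Illustrative sketch only (not the actual Spark implementation): AllocatedArray
// and its two cases stand in for the unsafe and generic allocation results.
object ArrayAllocationSketch {
  sealed trait AllocatedArray
  case class UnsafeLike(totalBytes: Long) extends AllocatedArray
  case class GenericLike(numElements: Int) extends AllocatedArray

  def allocateArrayData(
      elementSize: Option[Int],  // Some(size) only when the element type is primitive
      numElements: Int,
      additionalErrorMessage: String): AllocatedArray = {
    require(numElements >= 0, s"Cannot allocate array: $additionalErrorMessage")
    elementSize match {
      // Only the unsafe path needs the element size, so carrying it in an Option
      // also encodes "the element type is primitive" and the boolean flag can go.
      case Some(size) => UnsafeLike(size.toLong * numElements)
      // The generic path never looks at the element size.
      case None => GenericLike(numElements)
    }
  }
}
```

Callers with primitive element types would pass e.g. `Some(8)` for doubles, while
non-primitive element types would pass `None`.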
---