GitHub user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/18754#discussion_r158566190
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowWriter.scala ---
@@ -214,6 +216,22 @@ private[arrow] class DoubleWriter(val valueVector: Float8Vector) extends ArrowFi
}
}
+private[arrow] class DecimalWriter(
+ val valueVector: DecimalVector,
+ precision: Int,
+ scale: Int) extends ArrowFieldWriter {
+
+ override def setNull(): Unit = {
+ valueVector.setNull(count)
+ }
+
+ override def setValue(input: SpecializedGetters, ordinal: Int): Unit = {
+ val decimal = input.getDecimal(ordinal, precision, scale)
+ decimal.changePrecision(precision, scale)
--- End diff ---
Is it necessary to call `changePrecision` here when `getDecimal` already takes the precision/scale as input? Or is it not guaranteed to return a decimal with that scale?
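
For illustration only: if some `InternalRow` implementations return the stored `Decimal` as-is (ignoring the requested precision/scale), a defensive variant could keep the call but use its boolean result, since Spark's `Decimal.changePrecision` returns `false` on overflow. A minimal sketch, not the PR's code, assuming Arrow's `DecimalVector.setSafe(int, java.math.BigDecimal)` and the `count` field from the enclosing `ArrowFieldWriter`:

```scala
import org.apache.arrow.vector.DecimalVector
import org.apache.spark.sql.catalyst.expressions.SpecializedGetters

private[arrow] class DecimalWriter(
    val valueVector: DecimalVector,
    precision: Int,
    scale: Int) extends ArrowFieldWriter {

  override def setNull(): Unit = valueVector.setNull(count)

  override def setValue(input: SpecializedGetters, ordinal: Int): Unit = {
    val decimal = input.getDecimal(ordinal, precision, scale)
    // Rescale defensively in case the row returned the stored Decimal
    // unchanged; changePrecision returns false when the value cannot be
    // represented at (precision, scale).
    if (decimal.changePrecision(precision, scale)) {
      valueVector.setSafe(count, decimal.toJavaBigDecimal)
    } else {
      setNull() // overflow: write a null rather than a wrongly scaled value
    }
  }
}
```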
---