Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/18754#discussion_r158620106
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/arrow/ArrowWriter.scala ---
@@ -214,6 +216,22 @@ private[arrow] class DoubleWriter(val valueVector: Float8Vector) extends ArrowFieldWriter {
}
}
+private[arrow] class DecimalWriter(
+ val valueVector: DecimalVector,
+ precision: Int,
+ scale: Int) extends ArrowFieldWriter {
+
+ override def setNull(): Unit = {
+ valueVector.setNull(count)
+ }
+
+ override def setValue(input: SpecializedGetters, ordinal: Int): Unit = {
+ val decimal = input.getDecimal(ordinal, precision, scale)
+ decimal.changePrecision(precision, scale)
--- End diff --
Unfortunately, it depends on the implementation of `getDecimal` for now.
Btw, I think we also need to check the return value of `changePrecision()` and
set the value to null when it returns `false`, which indicates overflow.
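Something like the following sketch is what I have in mind; it assumes the
vector is populated via `DecimalVector.setSafe` with `Decimal.toJavaBigDecimal`
(that part is not shown in the quoted diff):
```scala
override def setValue(input: SpecializedGetters, ordinal: Int): Unit = {
  val decimal = input.getDecimal(ordinal, precision, scale)
  if (decimal.changePrecision(precision, scale)) {
    // Precision change succeeded: write the adjusted value into the vector.
    valueVector.setSafe(count, decimal.toJavaBigDecimal)
  } else {
    // changePrecision() returned false, i.e. the value overflowed, so store null.
    setNull()
  }
}
```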
---