Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21499#discussion_r193238183
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/types/DecimalType.scala ---
@@ -161,13 +161,17 @@ object DecimalType extends AbstractDataType {
    * This method is used only when `spark.sql.decimalOperations.allowPrecisionLoss` is set to true.
    */
   private[sql] def adjustPrecisionScale(precision: Int, scale: Int): DecimalType = {
-    // Assumptions:
+    // Assumption:
     assert(precision >= scale)
-    assert(scale >= 0)
     if (precision <= MAX_PRECISION) {
       // Adjustment only needed when we exceed max precision
       DecimalType(precision, scale)
+    } else if (scale < 0) {
+      // Decimal can have negative scale (SPARK-24468). In this case, we cannot allow a precision
+      // loss since we would cause a loss of digits in the integer part.
--- End diff ---
Is there a SQL standard for this? It seems reasonable to me to truncate the
integral part instead, e.g. `123456` -> `123000`.
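
As a side illustration (not part of the original comment), here is a minimal Scala sketch of the truncation suggested above, using `java.math.BigDecimal` with an assumed target scale of -3; the object name `NegativeScaleTruncation` is made up for this example:

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

object NegativeScaleTruncation {
  def main(args: Array[String]): Unit = {
    // A negative scale rounds to the left of the decimal point.
    // Truncating 123456 at scale -3 drops the three lowest integral digits.
    val v = new JBigDecimal("123456")
    val truncated = v.setScale(-3, RoundingMode.DOWN)
    println(truncated.toPlainString)      // 123000
    // Unlike dropping fractional digits, this changes the value itself,
    // which is the concern behind forbidding precision loss for negative scales.
    println(v.compareTo(truncated) != 0)  // true
  }
}
```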
---