mgaido91 commented on a change in pull request #23587: [SPARK-26664][SQL] Make
DecimalType's minimum adjusted scale configurable
URL: https://github.com/apache/spark/pull/23587#discussion_r249215661
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1347,6 +1347,15 @@ object SQLConf {
.booleanConf
.createWithDefault(true)
+ val DECIMAL_OPERATIONS_MINIMUM_ADJUSTED_SCALE =
+ buildConf("spark.sql.decimalOperations.minimumAdjustedScale")
+ .internal()
+ .doc("Decimal operations' minimum adjusted scale when " +
Review comment:
I am not sure which problem motivated opening this PR.
I think this value can be made a config, as there may be specific use cases
where tuning it is useful, even though I can't think of specific ones.
> it does not resolve the root issue.
What do you mean by "the root issue"? AFAIK, there is only one main issue
with decimal operations right now, which is division when a decimal with
negative scale is involved. For that problem, #22450 is waiting for reviews,
after positive feedback on the approach in the mailing-list discussion.
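
For context, the constant this PR would make configurable is the minimum scale kept when a decimal operation's required precision exceeds the 38-digit maximum. Below is a minimal, self-contained sketch of that adjustment rule (modeled on Spark's `DecimalType.adjustPrecisionScale`, which applies when `spark.sql.decimalOperations.allowPrecisionLoss` is enabled); the object and method names here are illustrative, not Spark's actual API.

```scala
object AdjustedScaleSketch {
  val MaxPrecision = 38
  // Hardcoded to 6 in Spark today; this PR proposes exposing it as
  // spark.sql.decimalOperations.minimumAdjustedScale.
  val MinimumAdjustedScale = 6

  // Given the precision/scale an operation would ideally need, return the
  // (precision, scale) that fits within MaxPrecision. When digits must be
  // dropped, integral digits are preserved first and the scale is reduced,
  // but never below MinimumAdjustedScale (unless the scale was already smaller).
  def adjust(precision: Int, scale: Int): (Int, Int) =
    if (precision <= MaxPrecision) {
      (precision, scale)
    } else {
      val intDigits = precision - scale
      val minScale = math.min(scale, MinimumAdjustedScale)
      val adjustedScale = math.max(MaxPrecision - intDigits, minScale)
      (MaxPrecision, adjustedScale)
    }

  def main(args: Array[String]): Unit = {
    // e.g. a multiplication that would ideally need decimal(47, 10):
    // 37 integral digits leave only 1 digit of scale, so the minimum
    // adjusted scale of 6 wins and some integral overflow risk is accepted.
    println(adjust(47, 10)) // (38, 6)
    println(adjust(20, 5))  // fits already: (20, 5)
  }
}
```

A larger minimum adjusted scale keeps more fractional digits at the cost of overflowing to null sooner on large intermediate results, which is presumably the trade-off a config would let users tune.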
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]