alamb opened a new issue, #7301:
URL: https://github.com/apache/arrow-datafusion/issues/7301

   ### Is your feature request related to a problem or challenge?
   
   As described in detail by @liukun4515, @tustvold, and @viirya on 
https://github.com/apache/arrow-datafusion/pull/6832, DataFusion's decimal 
division semantics differ from Spark's.
   
   @liukun4515 notes in 
https://github.com/apache/arrow-datafusion/pull/6832#issuecomment-1680098056 
that Spark has a config option to control precision loss: 
https://github.com/apache/spark/blob/2be20e54a2222f6cdf64e8486d1910133b43665f/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala#L246
   
   And @tustvold notes that for people looking to emulate Spark, which only supports 
precision up to 38, casting to Decimal256 and then truncating down to 
Decimal128 will be equivalent, and is what a precision-loss arithmetic kernel 
would do.
   
   ### Describe the solution you'd like
   
   For anyone who needs Spark-compatible decimal division rules, I suggest:
   
   1. Add a new config option to enable this behavior
   2. Apply the rewrite suggested by @tustvold (cast to Decimal256, divide, and 
then cast to Decimal128) as an 
[AnalyzerRule](https://docs.rs/datafusion/latest/datafusion/optimizer/analyzer/trait.AnalyzerRule.html#) (see the sketch below)
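   
   A minimal sketch of what such a rule could look like, assuming the new config 
option exists and leaving the actual expression walking/rewriting as a stub 
(the struct and rule names are made up for illustration):
   
   ```rust
   use datafusion::config::ConfigOptions;
   use datafusion::error::Result;
   use datafusion::logical_expr::LogicalPlan;
   use datafusion::optimizer::analyzer::AnalyzerRule;
   
   /// Hypothetical analyzer rule: when the (not yet existing) config option is
   /// enabled, rewrite every decimal division in the plan into
   /// CAST(CAST(lhs AS Decimal256) / CAST(rhs AS Decimal256) AS Decimal128(38, s)).
   #[derive(Debug, Default)]
   struct SparkCompatibleDecimalDivision;
   
   impl AnalyzerRule for SparkCompatibleDecimalDivision {
       fn name(&self) -> &str {
           "spark_compatible_decimal_division"
       }
   
       fn analyze(&self, plan: LogicalPlan, _config: &ConfigOptions) -> Result<LogicalPlan> {
           // Sketch only: a real implementation would
           //  1. return the plan unchanged unless the new config option is set,
           //  2. walk each expression in the plan looking for `/` on Decimal128
           //     operands (using the input schema to get operand types), and
           //  3. replace those expressions with the widen / divide / truncate
           //     form shown above.
           Ok(plan)
       }
   }
   ```
   
   The rule would then be registered alongside DataFusion's built-in analyzer 
rules so the rewrite happens during analysis, before the optimizer runs.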
   
   ### Describe alternatives you've considered
   
   See the ticket -- we discussed at length changing the semantics of division in 
arrow-rs and concluded there was no single agreed-upon ideal behavior.
   
   ### Additional context
   
   _No response_

