cloud-fan commented on issue #22450: [SPARK-25454][SQL] Avoid precision loss in division with decimal with negative scale
URL: https://github.com/apache/spark/pull/22450#issuecomment-448248886
 
 
   Now that it's Spark 3.0, shall we consider forbidding negative scale? It's an undocumented and broken feature, and many databases don't support it either. I checked that Parquet doesn't support negative scale, which means Spark can't support it well: Spark can't write such values to data sources like Parquet.
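For readers unfamiliar with the term: a negative scale means the decimal's exponent is positive, so trailing zeros are implied rather than stored. The sketch below uses Python's standard `decimal` module purely to illustrate the idea (scale = negative exponent); it is not Spark's `DecimalType` implementation.

```python
from decimal import Decimal

# 1.23E+4 is the value 12300 held as digits (1, 2, 3) with exponent +2,
# i.e. precision 3 and scale -2 in database terms (scale = -exponent).
# The two trailing zeros are implied, not stored.
d = Decimal("1.23E+4")
sign, digits, exponent = d.as_tuple()

print(digits)    # (1, 2, 3) -- only three significant digits are stored
print(exponent)  # 2, i.e. a scale of -2
print(int(d))    # 12300
```

Parquet's DECIMAL logical type has no way to encode such a value without widening it to a non-negative scale, which is the interoperability problem described above.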
   
   cc @dilipbiswal 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
