cloud-fan opened a new pull request #23388: [SPARK-26448][SQL] retain the difference between 0.0 and -0.0
URL: https://github.com/apache/spark/pull/23388
 
 
   ## What changes were proposed in this pull request?
   
   In https://github.com/apache/spark/pull/23043, we introduced a behavior change: Spark users can no longer distinguish 0.0 from -0.0.
   
   This PR proposes an alternative fix for the original bug that retains the difference between 0.0 and -0.0 inside Spark.
   
   The idea is to rewrite the window partition keys, join keys, and grouping keys during the logical planning phase, normalizing the special floating-point numbers there. Thus only the operators that care about special floating-point numbers pay the performance overhead, and end users can still distinguish -0.0.
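   To illustrate the normalization idea, here is a minimal, hypothetical sketch (not Spark's actual rewrite rule): -0.0 is mapped to +0.0, and any NaN bit pattern to the canonical NaN, before a value is hashed or compared as a key. The projected output value is left untouched, so users still see -0.0.
   
   ```java
   public class NormalizeFloatingKeys {
       // Hypothetical helper, for illustration only: canonicalize special
       // floating-point values before using them as a grouping/join/window key.
       static double normalize(double d) {
           if (d == 0.0d) return 0.0d;             // -0.0 == 0.0, so -0.0 maps to +0.0
           if (Double.isNaN(d)) return Double.NaN; // collapse all NaN bit patterns
           return d;
       }
   
       public static void main(String[] args) {
           // Without normalization, -0.0 and 0.0 have different bit patterns,
           // so hash-based grouping would put them in different groups:
           System.out.println(Double.doubleToLongBits(-0.0d) ==
                              Double.doubleToLongBits(0.0d));            // false
           // After normalization the key bits agree:
           System.out.println(Double.doubleToLongBits(normalize(-0.0d)) ==
                              Double.doubleToLongBits(normalize(0.0d))); // true
       }
   }
   ```
   
   Because only the key expressions are rewritten, the normalization cost is confined to the operators that need it.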
   
   ## How was this patch tested?
   
   Existing tests.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]