GitHub user adoron opened a pull request:

    https://github.com/apache/spark/pull/23043

    [SPARK-26021][SQL] replace minus zero with zero in UnsafeProjection

    GROUP BY treats -0.0 and 0.0 as different values, which is unlike Hive's
    behavior. In addition, the current behavior with codegen is unpredictable
    (see the example in the JIRA ticket).
    
    ## What changes were proposed in this pull request?
    
    In the BoundReference class, in the generated code, replace -0.0 with 0.0
    if the data type is double or float.
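    The root cause is that -0.0 and 0.0 compare equal under IEEE 754 but have
    distinct bit patterns, so byte- or hash-based grouping puts them in
    different buckets. A minimal sketch in plain Java (the `normalize` helper
    is illustrative, not the PR's actual generated code) shows the mismatch
    and the effect of the rewrite:

    ```java
    public class NormalizeNegativeZero {
        // Rewrite -0.0 to 0.0, as the PR does for double/float columns in
        // the generated projection code (sketch, not the actual codegen).
        public static double normalize(double v) {
            // -0.0d == 0.0d is true under IEEE 754, so this branch also
            // matches +0.0, for which the assignment is a no-op.
            return (v == -0.0d) ? 0.0d : v;
        }

        public static void main(String[] args) {
            // Equal as values...
            System.out.println(-0.0d == 0.0d);                          // true
            // ...but distinct at the bit level, which is what hashing sees.
            System.out.println(Double.doubleToLongBits(-0.0d)
                    == Double.doubleToLongBits(0.0d));                  // false
            System.out.println(Double.hashCode(-0.0d)
                    == Double.hashCode(0.0d));                          // false
            // After normalization the bit patterns agree.
            System.out.println(Double.doubleToLongBits(normalize(-0.0d))
                    == Double.doubleToLongBits(0.0d));                  // true
        }
    }
    ```

    Because UnsafeProjection writes raw bit patterns into UnsafeRow, doing the
    rewrite at projection time makes every downstream consumer (grouping,
    joins, distinct) see a single canonical zero.
    
    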
    
    ## How was this patch tested?
    
    Added tests

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/adoron/spark adoron-spark-26021-replace-minus-zero-with-zero

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/23043.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #23043
    
----
commit ee0ef91a7047d47328efac753e66ec97a91c0e37
Author: Alon Doron <adoron@...>
Date:   2018-11-14T16:18:30Z

    replace -0.0 with 0.0 in BoundAttribute
    added tests

commit 63b7f59ad44d0876ea6dde02e4204fc0140d0df6
Author: Alon Doron <adoron@...>
Date:   2018-11-14T16:27:24Z

    minor remove var type

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
