GitHub user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10745#discussion_r49902295
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/HyperLogLogPlusPlus.scala ---
    @@ -447,6 +447,7 @@ object HyperLogLogPlusPlus {
     
       private def validateDoubleLiteral(exp: Expression): Double = exp match {
         case Literal(d: Double, DoubleType) => d
    +    case Literal(dec: Decimal, dt: DecimalType) => dec.toDouble
    --- End diff ---
    
    I changed this because the old SQL parser was passing a ```Decimal``` literal to ```HyperLogLogPlusPlus```.
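
    For context, here is a minimal sketch of the validator with the new case in place; the imports and the fallback branch are my reconstruction of the surrounding code, not a verbatim quote:

    ```scala
    import org.apache.spark.sql.AnalysisException
    import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
    import org.apache.spark.sql.types.{Decimal, DecimalType, DoubleType}

    // Validate that the relative-SD argument is a literal we can turn into a Double.
    private def validateDoubleLiteral(exp: Expression): Double = exp match {
      case Literal(d: Double, DoubleType) => d
      // New case: the old SQL parser hands the argument over as a Decimal literal.
      case Literal(dec: Decimal, _: DecimalType) => dec.toDouble
      // Anything else (non-literal or wrong type) is rejected; the exact message is assumed.
      case _ =>
        throw new AnalysisException("The second argument should be a double literal.")
    }
    ```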
    
    The pattern match also rules out an NPE, since ```dec``` only binds to a non-null ```Decimal```. Spark's ```Decimal``` delegates the ```toDouble``` call to Java's ```BigDecimal```, which returns ```(-)Infinity``` rather than failing if the value is outside the double range. We should be good here.
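
    As a quick illustration of that saturating behavior (plain ```java.math.BigDecimal``` here, not Spark code):

    ```scala
    import java.math.BigDecimal

    // doubleValue() saturates to +/-Infinity instead of throwing when the
    // magnitude exceeds the double range (per the BigDecimal javadoc).
    val huge = new BigDecimal("1e400")
    assert(huge.doubleValue() == Double.PositiveInfinity)
    assert(huge.negate().doubleValue() == Double.NegativeInfinity)
    ```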

