Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7398#discussion_r34707783
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/types/AbstractDataType.scala ---
    @@ -114,6 +114,15 @@ private[sql] object TypeCollection {
         BooleanType,
         ByteType, ShortType, IntegerType, LongType)
     
    +  /**
    +   * Types that include numeric types and interval type. They are only used in unary_minus,
    +   * unary_positive, add and subtract operations.
    +   */
    +  val NumericAndInterval = TypeCollection(
    +    ByteType, ShortType, IntegerType, LongType,
    +    FloatType, DoubleType, DecimalType,
    +    IntervalType)
    --- End diff ---
    
    Actually we should use `NumericType` here.
Think about `2 + "2"`: before this PR, we will first cast `"2"` to `2.0` (double is the default for numeric types), so the result is the double `4.0`. After this PR, we will first cast `"2"` to `2` (byte is the default for your type collection), and thus the result becomes an int.
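The behavior change above can be sketched in a few lines. This is a hypothetical illustration, not Spark's actual `TypeCoercion` code: it assumes (as the comment implies) that the first member of a type collection acts as the default cast target for an un-typed string operand, and the `defaultCast` helper is invented for this sketch.

```scala
// Hypothetical sketch of the coercion concern above -- NOT Spark's real
// implementation. `defaultCast` and the toy DataType hierarchy are
// invented for illustration only.
object CoercionSketch {
  sealed trait DataType { def name: String }
  case object ByteType   extends DataType { val name = "byte" }
  case object DoubleType extends DataType { val name = "double" }

  // Assumption: the first member of a type collection is the default
  // cast target for a string operand of unknown numeric type.
  def defaultCast(collection: Seq[DataType]): DataType = collection.head

  def main(args: Array[String]): Unit = {
    // NumericType defaults to double, so "2" becomes 2.0 and 2 + "2"
    // stays a double computation.
    val before = defaultCast(Seq(DoubleType, ByteType))
    // NumericAndInterval as written lists ByteType first, so "2" would
    // become a byte and the addition would produce an integral result.
    val after = defaultCast(Seq(ByteType, DoubleType))
    println("before this PR, \"2\" is cast to: " + before.name)
    println("after this PR, \"2\" is cast to: " + after.name)
  }
}
```

The sketch only shows why the ordering of the collection matters: whichever type the coercion rule falls back to first determines the result type of the whole expression.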

