Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19692#discussion_r149639379
  
    --- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala
 ---
    @@ -137,6 +137,8 @@ object TypeCoercion {
         case (DateType, TimestampType) => Some(StringType)
         case (StringType, NullType) => Some(StringType)
         case (NullType, StringType) => Some(StringType)
    +    case (n: NumericType, s: StringType) => Some(DoubleType)
    +    case (s: StringType, n: NumericType) => Some(DoubleType)
    --- End diff --
    
    We intentionally chose not to do this, see 
https://github.com/apache/spark/pull/15880 .
    
    But your use case is valid. I think casting the string to the decimal type of 
the other side is too strict; how about
    ```
    case (_: StringType, d: DecimalType) => 
DecimalPrecision.widerDecimalType(DecimalType.SYSTEM_DEFAULT, d)
    ```
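    For context, `widerDecimalType` picks a decimal type wide enough to hold both operands. A minimal self-contained sketch of that widening logic (modeled on Spark's `DecimalPrecision`, but not the actual implementation, which also bounds precision at 38):
    ```scala
    // Sketch of decimal widening, modeled after DecimalPrecision.widerDecimalType.
    // Hypothetical standalone types for illustration; the real Spark version
    // additionally caps the result at the maximum precision (38).
    case class DecimalType(precision: Int, scale: Int)

    // Spark's DecimalType.SYSTEM_DEFAULT is DecimalType(38, 18).
    val SystemDefault = DecimalType(38, 18)

    // Keep the larger scale, plus enough integral digits to hold the
    // larger integral part of either operand.
    def widerDecimalType(d1: DecimalType, d2: DecimalType): DecimalType = {
      val scale = math.max(d1.scale, d2.scale)
      val integral = math.max(d1.precision - d1.scale, d2.precision - d2.scale)
      DecimalType(integral + scale, scale)
    }

    // Coercing a string (treated as SYSTEM_DEFAULT) against DecimalType(10, 2)
    // under the proposed rule yields DecimalType(38, 18).
    println(widerDecimalType(SystemDefault, DecimalType(10, 2)))
    ```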
    
    cc @gatorsmile 


---
