Github user caneGuy commented on a diff in the pull request:
https://github.com/apache/spark/pull/20508#discussion_r166182782
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala ---
@@ -327,6 +327,14 @@ object TypeCoercion {
      // Skip nodes who's children have not been resolved yet.
      case e if !e.childrenResolved => e
+     // IntegralType should not be converted to double, which would cause precision loss.
+     case a @ BinaryArithmetic(left @ StringType(), right @ IntegralType()) =>
--- End diff ---
Thanks @wangyum, it will return `NULL`.
I modified it to use `DecimalType.SYSTEM_DEFAULT` instead.
I considered checking the actual value, but I think `DecimalType.SYSTEM_DEFAULT` is enough. What do you think about this?
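For context, here is a minimal, self-contained Scala sketch (illustrative only, not part of the patch; the object name and sample value are made up) of why casting a numeric string to double loses precision for large integral values, while a decimal with the precision and scale behind `DecimalType.SYSTEM_DEFAULT` (`DecimalType(38, 18)` in Spark) keeps them exact:

```scala
// Illustrative demo: Double has a 52-bit mantissa, so large integral
// values parsed from strings get rounded, and adding 1 is silently lost.
// BigDecimal keeps every digit, much like a cast to
// DecimalType.SYSTEM_DEFAULT would.
object StringArithmeticPrecision extends App {
  val s = "9223372036854775807" // Long.MaxValue as a string

  val asDouble: Double = s.toDouble
  println(asDouble + 1)  // 9.223372036854776E18 -- already rounded, +1 lost

  val asDecimal: BigDecimal = BigDecimal(s)
  println(asDecimal + 1) // 9223372036854775808 -- exact
}
```

This is why casting the string operand to `DecimalType.SYSTEM_DEFAULT` rather than `DoubleType` avoids silent precision loss when the other operand is integral, at the cost of decimal rather than floating-point arithmetic.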