Kwafoor commented on a change in pull request #34862:
URL: https://github.com/apache/spark/pull/34862#discussion_r767385081



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
##########
@@ -652,7 +652,13 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
       buildCast[UTF8String](_, UTF8StringUtils.toIntExact)
     case StringType =>
       val result = new IntWrapper()
-      buildCast[UTF8String](_, s => if (s.toInt(result)) result.value else null)
+      buildCast[UTF8String](_, s => {

Review comment:
       I have seen this case when users migrate their HiveSQL to SparkSQL: they often filter a column by comparing a String with an IntegerType. With ANSI mode enabled this case is handled correctly, because SparkSQL throws an exception, but users are unwilling to change their code, so ANSI mode is usually disabled. In that case the query silently returns a wrong result. I think SparkSQL should tell the user where their SQL is wrong: if a user runs a big, complex query, it is hard to locate the mistake without a thrown exception.
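
       To illustrate the two behaviors under discussion, here is a minimal sketch (not Spark's actual `Cast` implementation; `CastSketch`, `castToInt`, and `ansiEnabled` are hypothetical names for illustration) of how a string-to-int cast either throws on bad input (ANSI mode) or silently yields null (legacy mode):

```scala
// Hypothetical sketch of ANSI vs. legacy cast-failure behavior.
// Not the real Spark code: names here are made up for illustration.
object CastSketch {
  def castToInt(s: String, ansiEnabled: Boolean): Any =
    try {
      s.trim.toInt
    } catch {
      case _: NumberFormatException =>
        if (ansiEnabled) {
          // ANSI behavior: surface the bad input to the user immediately.
          throw new NumberFormatException(s"invalid input for int cast: $s")
        } else {
          // Legacy behavior: bad input silently becomes null,
          // which can produce a wrong result deep inside a big query.
          null
        }
    }

  def main(args: Array[String]): Unit = {
    println(castToInt("42", ansiEnabled = false))  // 42
    println(castToInt("abc", ansiEnabled = false)) // null
    // castToInt("abc", ansiEnabled = true) would throw NumberFormatException
  }
}
```

       With `ansiEnabled = false`, the failure is invisible at the point of the cast, which is exactly why a wrong filter comparison is so hard to track down in a large query.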




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


