turboFei commented on issue #27151: [SPARK-26218][Follow up] Fix the corner case when casting float to Integer.
URL: https://github.com/apache/spark/pull/27151#issuecomment-572610559

> Hm, but:
>
> ```
> scala> (Int.MaxValue.toFloat+1).toInt
> res13: Int = 2147483647
>
> scala> (Int.MaxValue.toFloat+1).toInt == Int.MaxValue
> res14: Boolean = true
> ```
>
> Those values do correctly cast to an int. The cast does lose precision of course, but according to Scala/Java, the result is correct, no?

Yes, the behavior is consistent with Scala/Java: if the float value exceeds Int.MaxValue, casting it to Int yields Int.MaxValue. But when `spark.sql.ansi.enabled` is true, we should throw an exception to stay consistent with the ANSI SQL standard.
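A minimal sketch reproducing the saturation behavior quoted above, outside of Spark. On the JVM, converting a `Float` larger than `Int.MaxValue` to `Int` saturates at `Int.MaxValue` (per JLS §5.1.3) rather than wrapping around. The `toIntOrThrow` helper below is purely illustrative (it is not a Spark API) and only sketches the kind of range check ANSI mode implies:

```scala
object FloatToIntCast {
  // Hypothetical helper sketching an ANSI-style check: reject values
  // outside the Int range instead of silently saturating.
  def toIntOrThrow(f: Float): Int = {
    if (f > Int.MaxValue.toFloat || f < Int.MinValue.toFloat)
      throw new ArithmeticException(s"Casting $f to int causes overflow")
    f.toInt
  }

  def main(args: Array[String]): Unit = {
    val f = Int.MaxValue.toFloat + 1 // exceeds Int.MaxValue after rounding

    // Plain JVM semantics: the conversion saturates at Int.MaxValue.
    println(f.toInt == Int.MaxValue) // prints "true"
  }
}
```

Note that `Int.MaxValue.toFloat` itself rounds up to 2^31 (a `Float` cannot represent 2147483647 exactly), which is why even `Int.MaxValue.toFloat` already compares greater than any representable in-range value after adding 1.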
