gengliangwang commented on code in PR #38988:
URL: https://github.com/apache/spark/pull/38988#discussion_r1044100198
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala:
##########
@@ -1280,6 +1280,16 @@ case class Cast(
}
}
+ // Whether Spark SQL can evaluate the try_cast as the legacy cast, so that no `try...catch`
+ // is needed and the performance can be faster.
+ private lazy val canUseLegacyCastForTryCast: Boolean = {
+ (child.dataType, dataType) match {
+ case (StringType, _: FractionalType) => true
+ case (StringType, _: DatetimeType) => true
+ case _ => false
Review Comment:
Using `Cast.canUpCast` is not safe here. For example, Decimal(19, 0) can be up cast to long, since the decimal type chosen for long is Decimal(20, 0); but the max long value has only 19 digits, so a 19-digit decimal value can still overflow a long. I will revisit this part later.
On the other hand, if the cast can up cast to another type, adding the `try...catch` won't significantly affect the performance.
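To make the overflow argument concrete, here is a minimal plain-Scala sketch (not Spark code; the object name is illustrative) showing that a 19-digit value fits in Decimal(19, 0) yet exceeds `Long.MaxValue`:

```scala
// Why treating Decimal(19, 0) -> Long as an "up cast" is unsafe:
// a Decimal(19, 0) can hold any 19-digit integer, but Long.MaxValue
// (9223372036854775807) is only *some* 19-digit values, not all of them.
object DecimalToLongOverflow {
  def main(args: Array[String]): Unit = {
    val maxLong = BigDecimal(Long.MaxValue)               // 19 digits
    val nineteenNines = BigDecimal("9999999999999999999") // 19 digits, fits Decimal(19, 0)

    // The value is representable as Decimal(19, 0) but overflows a long.
    println(nineteenNines.precision)      // 19
    println(nineteenNines > maxLong)      // true
    println(nineteenNines.isValidLong)    // false
  }
}
```

This is why the precision check alone (19 <= 20) is not sufficient: the cast must still guard against values between `Long.MaxValue` and `10^19 - 1`.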
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]