maropu commented on a change in pull request #29731:
URL: https://github.com/apache/spark/pull/29731#discussion_r487696800
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
##########
@@ -658,6 +658,41 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
   }
 }
+  private[this] def stringToDecimal(str: UTF8String, decimalType: DecimalType): Decimal = {
+    val bigDecimal = try {
+      // According the benchmark test, `s.toString.trim` is much faster than `s.trim.toString`.
+      // Please refer to https://github.com/apache/spark/pull/26640
+      new JavaBigDecimal(str.toString.trim)
+    } catch {
+      case _: NumberFormatException =>
+        if (ansiEnabled) {
+          throw new NumberFormatException(s"invalid input syntax for type numeric: $str")
+        } else {
+          null
+        }
+    }
+
+    if (bigDecimal != null) {
+      val precision = if (bigDecimal.scale < 0) {
Review comment:
Can the scale actually be negative when converting from a string? Even if so, this behavior depends on `LEGACY_ALLOW_NEGATIVE_SCALE_OF_DECIMAL_ENABLED`.
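(For reference on the question above: `java.math.BigDecimal` does produce a negative scale when the input string uses scientific notation, so the branch being reviewed is reachable from string input. A minimal standalone Java illustration, not Spark code:)

```java
import java.math.BigDecimal;

public class NegativeScaleDemo {
    public static void main(String[] args) {
        // "1.23E+5" parses to unscaled value 123 with scale -3, so a
        // string-sourced BigDecimal can indeed carry a negative scale.
        BigDecimal fromString = new BigDecimal("1.23E+5");
        System.out.println(fromString.scale());      // prints -3
        System.out.println(fromString.precision());  // prints 3
    }
}
```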
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
##########
@@ -658,6 +658,41 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
   }
 }
+  private[this] def stringToDecimal(str: UTF8String, decimalType: DecimalType): Decimal = {
+    val bigDecimal = try {
+      // According the benchmark test, `s.toString.trim` is much faster than `s.trim.toString`.
+      // Please refer to https://github.com/apache/spark/pull/26640
+      new JavaBigDecimal(str.toString.trim)
+    } catch {
+      case _: NumberFormatException =>
+        if (ansiEnabled) {
+          throw new NumberFormatException(s"invalid input syntax for type numeric: $str")
+        } else {
+          null
+        }
+    }
+
+    if (bigDecimal != null) {
+      val precision = if (bigDecimal.scale < 0) {
+        bigDecimal.precision - bigDecimal.scale
+      } else {
+        bigDecimal.precision
+      }
+
+      if (precision > DecimalType.MAX_PRECISION) {
Review comment:
Could you leave a comment here explaining why we need this seemingly redundant check?
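(For context: the check appears to fast-fail strings whose integral digit count exceeds `DecimalType.MAX_PRECISION` (38) before any precision adjustment is attempted. A hedged Java sketch of the adjusted-precision arithmetic from the hunk above; the `effectivePrecision` helper name is hypothetical, not Spark's:)

```java
import java.math.BigDecimal;

public class PrecisionCheckDemo {
    // Mirrors the branch under review: for a negative scale, the number of
    // integral digits is precision - scale, which is what must fit within
    // the decimal type's maximum precision.
    static int effectivePrecision(BigDecimal d) {
        return d.scale() < 0 ? d.precision() - d.scale() : d.precision();
    }

    public static void main(String[] args) {
        // "1E+40" has precision 1 but 41 integral digits, far beyond 38,
        // so a fast-fail check can reject it before any rounding work.
        System.out.println(effectivePrecision(new BigDecimal("1E+40")));  // prints 41
        System.out.println(effectivePrecision(new BigDecimal("123.45"))); // prints 5
    }
}
```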
##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CastSuite.scala
##########
@@ -1405,4 +1426,35 @@ class AnsiCastSuite extends CastSuiteBase {
     checkEvaluation(cast(negativeTs, LongType), expectedSecs)
   }
 }
+
+  test("Fast fail for cast string type to decimal type in ansi mode") {
+    checkEvaluation(cast("12345678901234567890123456789012345678", DecimalType(38, 0)),
+      Decimal("12345678901234567890123456789012345678"))
Review comment:
Is this behavior different between ANSI enabled/disabled? I think this test should only cover behaviors specific to ANSI mode.
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
##########
@@ -670,18 +705,7 @@ abstract class CastBase extends UnaryExpression with TimeZoneAwareExpression wit
   private[this] def castToDecimal(from: DataType, target: DecimalType): Any => Any = from match {
     case StringType =>
-      buildCast[UTF8String](_, s => try {
-        // According the benchmark test, `s.toString.trim` is much faster than `s.trim.toString`.
-        // Please refer to https://github.com/apache/spark/pull/26640
-        changePrecision(Decimal(new JavaBigDecimal(s.toString.trim)), target)
-      } catch {
-        case _: NumberFormatException =>
-          if (ansiEnabled) {
-            throw new NumberFormatException(s"invalid input syntax for type numeric: $s")
-          } else {
-            null
-          }
-      })
+      buildCast[UTF8String](_, s => stringToDecimal(s, target))
Review comment:
nit: How about `buildCast[UTF8String](_, stringToDecimal(_, target))`?
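(The suggestion uses Scala's placeholder syntax, where `stringToDecimal(_, target)` is shorthand for `s => stringToDecimal(s, target)`. For readers less familiar with Scala, the closest Java analogue is collapsing an explicit lambda into a method reference; a toy example, not Spark code:)

```java
import java.util.function.Function;

public class PlaceholderAnalogy {
    public static void main(String[] args) {
        // Both forms denote the same function; the second is just shorter,
        // which is the spirit of the Scala placeholder suggestion.
        Function<String, Integer> explicit = s -> Integer.parseInt(s);
        Function<String, Integer> shorthand = Integer::parseInt;
        System.out.println(explicit.apply("42"));   // prints 42
        System.out.println(shorthand.apply("42"));  // prints 42
    }
}
```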
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]