gotocoding-DB commented on code in PR #48773:
URL: https://github.com/apache/spark/pull/48773#discussion_r1837959815


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/IntervalUtils.scala:
##########
@@ -785,10 +785,15 @@ object IntervalUtils extends SparkIntervalUtils {
       secs: Decimal): Long = {
     assert(secs.scale == 6, "Seconds fractional must have 6 digits for microseconds")
     var micros = secs.toUnscaledLong
-    micros = Math.addExact(micros, Math.multiplyExact(days, MICROS_PER_DAY))
-    micros = Math.addExact(micros, Math.multiplyExact(hours, MICROS_PER_HOUR))
-    micros = Math.addExact(micros, Math.multiplyExact(mins, MICROS_PER_MINUTE))
-    micros
+    try {
+      micros = Math.addExact(micros, Math.multiplyExact(days, MICROS_PER_DAY))
+      micros = Math.addExact(micros, Math.multiplyExact(hours, MICROS_PER_HOUR))
+      micros = Math.addExact(micros, Math.multiplyExact(mins, MICROS_PER_MINUTE))
+      micros
+    } catch {
+      case _: ArithmeticException =>
+        throw QueryExecutionErrors.withoutSuggestionIntervalArithmeticOverflowError()
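
For context, a self-contained sketch of the overflow behavior this try/catch guards against. `toMicros` and `overflowError` are simplified, hypothetical stand-ins for the real method and for `QueryExecutionErrors.withoutSuggestionIntervalArithmeticOverflowError()`; the constants match Spark's `DateTimeConstants`:

```scala
import java.util.concurrent.TimeUnit

object OverflowSketch {
  // Same values as Spark's DateTimeConstants.
  val MICROS_PER_MINUTE: Long = TimeUnit.MINUTES.toMicros(1)
  val MICROS_PER_HOUR: Long = TimeUnit.HOURS.toMicros(1)
  val MICROS_PER_DAY: Long = TimeUnit.DAYS.toMicros(1)

  // Hypothetical stand-in for the Spark error helper used in the patch.
  def overflowError(): ArithmeticException =
    new ArithmeticException("INTERVAL_ARITHMETIC_OVERFLOW")

  def toMicros(days: Int, hours: Long, mins: Long, microsOfSecond: Long): Long = {
    var micros = microsOfSecond
    try {
      // The *Exact variants throw ArithmeticException on Long overflow
      // instead of silently wrapping around.
      micros = Math.addExact(micros, Math.multiplyExact(days, MICROS_PER_DAY))
      micros = Math.addExact(micros, Math.multiplyExact(hours, MICROS_PER_HOUR))
      micros = Math.addExact(micros, Math.multiplyExact(mins, MICROS_PER_MINUTE))
      micros
    } catch {
      case _: ArithmeticException => throw overflowError()
    }
  }

  def main(args: Array[String]): Unit = {
    println(toMicros(days = 1, hours = 2, mins = 3, microsOfSecond = 0))
    // Int.MaxValue days * 86_400_000_000L microseconds exceeds Long.MaxValue.
    try toMicros(days = Int.MaxValue, hours = 0, mins = 0, microsOfSecond = 0)
    catch { case e: ArithmeticException => println(s"caught: ${e.getMessage}") }
  }
}
```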

Review Comment:
   @MaxGekk I think it's better to **fix it separately**, because right now it's `object IntervalUtils extends SparkIntervalUtils`.
   If I add `extends ... with SupportQueryContext`, I would have to implement a lot of members that come with `extends Expression`:
   <img width="718" alt="image" src="https://github.com/user-attachments/assets/90a6da4d-3487-43b9-aa55-9f41361d18f0">
   
   Is that ok with you? I'd be happy to open a separate PR that adds SupportQueryContext to IntervalUtils.
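
A minimal sketch of the constraint being described, using simplified stand-in types (the real Catalyst `Expression` works with `DataType` and `InternalRow` and has many more abstract members, including codegen hooks):

```scala
// Simplified stand-ins, not the real Catalyst classes.
abstract class Expression {
  def children: Seq[Expression]
  def dataType: String        // DataType in real Catalyst
  def nullable: Boolean
  def eval(input: Any): Any   // takes an InternalRow in real Catalyst
}

// SupportQueryContext is bound to Expression, so any mix-in target
// inherits all of Expression's abstract members.
trait SupportQueryContext extends Expression {
  def initQueryContext(): Option[String]  // Option[QueryContext] in Spark
}

// Even a do-nothing object must stub out every member, which is the work
// the comment proposes deferring to a follow-up PR:
object IntervalUtilsSketch extends SupportQueryContext {
  override def children: Seq[Expression] = Nil
  override def dataType: String = "interval"
  override def nullable: Boolean = false
  override def eval(input: Any): Any = ()
  override def initQueryContext(): Option[String] = None
}
```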


