MaxGekk opened a new pull request #31799:
URL: https://github.com/apache/spark/pull/31799
### What changes were proposed in this pull request?
In the PR, I propose to specially handle the seconds amount
`-9223372036855` in `IntervalUtils.durationToMicros()`. Starting from that
amount (any duration with a seconds field < `-9223372036855`), input
durations cannot fit into `Long` when converted to microseconds. For example,
the amount of microseconds = `Long.MinValue = -9223372036854775808` can be
represented in two forms:
1. seconds = -9223372036854, nanoAdjustment = -775808, or
2. seconds = -9223372036855, nanoAdjustment = +224192
The method `Duration.ofSeconds()` produces the second form, but that form
causes an overflow while converting `-9223372036855` seconds to microseconds.
In the PR, I propose to convert the second form to the first one when the
seconds field of the input duration equals `-9223372036855`.
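The normalization that produces the second form can be observed directly with `java.time.Duration`: it keeps the nano adjustment in the range `[0, 999999999]`, so both constructions below (corresponding to the two forms above, with the nano adjustments expressed in nanoseconds) yield the same normalized instance, the one with seconds = `-9223372036855`:

```scala
import java.time.Duration

// Form 1: seconds = -9223372036854, nanoAdjustment = -775808 us (-775808000 ns)
val a = Duration.ofSeconds(-9223372036854L, -775808000L)
// Form 2: seconds = -9223372036855, nanoAdjustment = +224192 us (+224192000 ns)
val b = Duration.ofSeconds(-9223372036855L, 224192000L)

// Duration normalizes nanos into [0, 999999999], so both are stored as form 2.
// a == b, a.getSeconds == -9223372036855L, a.getNano == 224192000
```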
### Why are the changes needed?
The changes fix the issue demonstrated by the code:
```scala
scala> durationToMicros(microsToDuration(Long.MinValue))
java.lang.ArithmeticException: long overflow
at java.lang.Math.multiplyExact(Math.java:892)
  at org.apache.spark.sql.catalyst.util.IntervalUtils$.durationToMicros(IntervalUtils.scala:782)
... 49 elided
```
The `durationToMicros()` method cannot handle valid output of
`microsToDuration()`.
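The proposed boundary handling can be sketched as follows. This is a minimal sketch, not the actual Spark implementation: the object name, constants, and the assumption that `microsToDuration` is equivalent to `Duration.of(micros, ChronoUnit.MICROS)` are mine.

```scala
import java.time.Duration

// Hypothetical sketch of the fix: specially handle the one seconds value
// whose product with 1,000,000 overflows Long.
object DurationToMicrosSketch {
  private val MICROS_PER_SECOND = 1000000L
  private val NANOS_PER_MICROS = 1000L
  // Smallest seconds value: multiplying it by MICROS_PER_SECOND overflows Long.
  private val MIN_SECONDS = -9223372036855L

  def durationToMicros(duration: Duration): Long = {
    val seconds = duration.getSeconds
    if (seconds == MIN_SECONDS) {
      // Re-express (-9223372036855 s, +nanos) as (-9223372036854 s, negative
      // micros adjustment) so the multiplication no longer overflows.
      val us = (seconds + 1) * MICROS_PER_SECOND
      Math.addExact(us, duration.getNano / NANOS_PER_MICROS - MICROS_PER_SECOND)
    } else {
      val us = Math.multiplyExact(seconds, MICROS_PER_SECOND)
      Math.addExact(us, duration.getNano / NANOS_PER_MICROS)
    }
  }
}
```

With this normalization, the round trip from the example above no longer throws: `durationToMicros` of the duration representing `Long.MinValue` microseconds returns `Long.MinValue`.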
### Does this PR introduce _any_ user-facing change?
Should not, since the new interval types have not been released yet.
### How was this patch tested?
By running new UT from `IntervalUtilsSuite`.