[ https://issues.apache.org/jira/browse/SPARK-54836?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated SPARK-54836:
-----------------------------------
    Labels: pull-request-available  (was: )

> Include timestamp value in ArithmeticException for timestamp overflow errors
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-54836
>                 URL: https://issues.apache.org/jira/browse/SPARK-54836
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 4.1.0
>            Reporter: Ganesha S
>            Priority: Major
>              Labels: pull-request-available
>
> Currently, when a timestamp overflows during an interval addition operation, 
> Spark throws a generic ArithmeticException whose only message is "long 
> overflow", making it hard for users to determine which timestamp value 
> caused the issue.
>  
> {code:java}
> java.lang.ArithmeticException: long overflow
>         at java.base/java.lang.Math.addExact(Math.java:903)
>         at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.instantToMicros(SparkDateTimeUtils.scala:144)
>         at org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.instantToMicros$(SparkDateTimeUtils.scala:137)
>         at org.apache.spark.sql.catalyst.util.DateTimeUtils$.instantToMicros(DateTimeUtils.scala:41)
>         at org.apache.spark.sql.catalyst.util.DateTimeUtils$.timestampAddInterval(DateTimeUtils.scala:319)
>         at org.apache.spark.sql.catalyst.util.DateTimeUtils.timestampAddInterval(DateTimeUtils.scala)
>         at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate.subExpr_13$(Unknown Source)
>         at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificPredicate.eval(Unknown Source)
>         at org.apache.spark.sql.execution.FilterEvaluatorFactory$FilterPartitionEvaluator.$anonfun$eval$1(FilterEvaluatorFactory.scala:42)
>         at org.apache.spark.sql.execution.FilterEvaluatorFactory$FilterPartitionEvaluator.$anonfun$eval$1$adapted(FilterEvaluatorFactory.scala:41)
>         at scala.collection.Iterator$$anon$6.hasNext(Iterator.scala:479)
>         at scala.collection.Iterator$$anon$9.hasNext(Iterator.scala:583)
>         at org.apache.spark.sql.execution.collect.UnsafeRowBatchUtils$.$anonfun$encodeUnsafeRows$5(UnsafeRowBatchUtils.scala:88)
> {code}
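> For reference, a query along the following lines can hit this overflow (a 
> hypothetical reproduction, not taken from the report; whether the exact 
> literal and error shape match depends on the Spark version and ANSI 
> settings):
> 
> {code:scala}
> // Hypothetical repro, assuming a running SparkSession named `spark`:
> // adding 300000 years pushes the result past the maximum timestamp Spark
> // can represent as microseconds since the epoch, so converting the
> // resulting instant back to microseconds overflows a signed 64-bit long.
> spark.sql("SELECT timestamp'2024-01-01 00:00:00' + INTERVAL 300000 YEARS").collect()
> {code}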
> We should enhance the ArithmeticException thrown for timestamp overflow 
> errors to include the offending timestamp value, making debugging easier.
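> A minimal, self-contained sketch of the idea (not the actual patch; the 
> helper name instantToMicrosWithContext is made up for illustration) is to 
> catch the overflow from the exact arithmetic and rethrow with the offending 
> instant in the message:
> 
> {code:scala}
> import java.time.Instant
> import java.util.concurrent.TimeUnit
> 
> // Sketch only: convert an Instant to microseconds since the epoch and, on
> // overflow, rethrow an ArithmeticException that names the timestamp instead
> // of the bare "long overflow".
> def instantToMicrosWithContext(instant: Instant): Long = {
>   try {
>     val micros = Math.multiplyExact(instant.getEpochSecond, 1000000L)
>     Math.addExact(micros, TimeUnit.NANOSECONDS.toMicros(instant.getNano.toLong))
>   } catch {
>     case _: ArithmeticException =>
>       throw new ArithmeticException(
>         s"long overflow: timestamp $instant cannot be represented as microseconds since the epoch")
>   }
> }
> {code}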



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
