[
https://issues.apache.org/jira/browse/SPARK-34564?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17292427#comment-17292427
]
Maxim Gekk commented on SPARK-34564:
------------------------------------
We changed the behavior intentionally because we do believe it is better to
raise an error than to silently return an incorrect result.
> However, the question is: even if such late dates are not supported, could it
> fail in a more gentle way?
How? What would you like to see?
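For reference, a caller can already fail early with a clearer message by validating the value before it reaches the encoder. A minimal sketch, assuming the limiting factor is an Int-sized epoch-day value; the helper name checkedDate is hypothetical, not a Spark API:
{code:java}
import java.sql.Date

// Hypothetical user-side guard, not part of Spark: reject dates whose
// epoch-day value cannot be represented as an Int before encoding them.
def checkedDate(millis: Long): Date = {
  val epochDay = Math.floorDiv(millis, 86400000L) // milliseconds per day
  require(epochDay >= Int.MinValue && epochDay <= Int.MaxValue,
    s"date with epoch day $epochDay is outside the supported range")
  new Date(millis)
}

checkedDate(Int.MaxValue)   // fine: roughly 24 days after 1970-01-01
checkedDate(Long.MaxValue)  // fails fast with an IllegalArgumentException
{code}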
> DateTimeUtils.fromJavaDate fails for very late dates during casting to Int
> --------------------------------------------------------------------------
>
> Key: SPARK-34564
> URL: https://issues.apache.org/jira/browse/SPARK-34564
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.1, 3.2.0, 3.1.2
> Reporter: kondziolka9ld
> Priority: Major
>
> Please consider the following scenario on *spark-3.0.1*:
> {code:java}
> scala> List(("some date", new Date(Int.MaxValue)), ("some corner case date", new Date(Long.MaxValue))).toDF
> java.lang.RuntimeException: Error while encoding: java.lang.ArithmeticException: integer overflow
> staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, knownnotnull(assertnotnull(input[0, scala.Tuple2, true]))._1, true, false) AS _1#0
> staticinvoke(class org.apache.spark.sql.catalyst.util.DateTimeUtils$, DateType, fromJavaDate, knownnotnull(assertnotnull(input[0, scala.Tuple2, true]))._2, true, false) AS _2#1
> at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$Serializer.apply(ExpressionEncoder.scala:215)
> at org.apache.spark.sql.SparkSession.$anonfun$createDataset$1(SparkSession.scala:466)
> at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
> at scala.collection.immutable.List.foreach(List.scala:392)
> at scala.collection.TraversableLike.map(TraversableLike.scala:238)
> at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
> at scala.collection.immutable.List.map(List.scala:298)
> at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:466)
> at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:353)
> at org.apache.spark.sql.SQLImplicits.localSeqToDatasetHolder(SQLImplicits.scala:231)
> ... 51 elided
> Caused by: java.lang.ArithmeticException: integer overflow
> at java.lang.Math.toIntExact(Math.java:1011)
> at org.apache.spark.sql.catalyst.util.DateTimeUtils$.fromJavaDate(DateTimeUtils.scala:111)
> at org.apache.spark.sql.catalyst.util.DateTimeUtils.fromJavaDate(DateTimeUtils.scala)
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
> at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$Serializer.apply(ExpressionEncoder.scala:211)
> ... 60 more
> {code}
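> For illustration, a rough back-of-the-envelope calculation of why only the second tuple trips the overflow; the 86400000L constant (milliseconds per day) is an assumption of this sketch, not taken from Spark's code:
> {code:java}
> val msPerDay = 86400000L
>
> Int.MaxValue / msPerDay   // = 24: a tiny epoch-day value, encodes fine
> Long.MaxValue / msPerDay  // = 106751991167: far beyond Int.MaxValue (2147483647)
> {code}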
> In contrast, on *spark-2.4.7* it is possible to create a dataframe with such values:
> {code:java}
> scala> val df = List(("some date", new Date(Int.MaxValue)), ("some corner case date", new Date(Long.MaxValue))).toDF
> df: org.apache.spark.sql.DataFrame = [_1: string, _2: date]
>
> scala> df.show
> +--------------------+-------------+
> | _1| _2|
> +--------------------+-------------+
> | some date| 1970-01-25|
> |some corner case ...|1701498-03-18|
> +--------------------+-------------+
> {code}
> Anyway, I am aware that when collecting these data I will get a different result:
> {code:java}
> scala> df.collect
> res10: Array[org.apache.spark.sql.Row] = Array([some date,1970-01-25], [some corner case date,?498-03-18])
> {code}
> which seems natural given the behaviour of *java.sql.Date*:
> {code:java}
> scala> new java.sql.Date(Long.MaxValue)
> res1: java.sql.Date = ?994-08-17
> {code}
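> As a side note on the odd "?994-08-17" rendering: Long.MaxValue milliseconds lies roughly 292 million years after 1970, so the year is far wider than the usual four digits and only its last few digits survive the fixed-width display. A quick estimate (my own arithmetic, using a naive 365-day year):
> {code:java}
> val msPerYear = 365L * 24 * 60 * 60 * 1000
> Long.MaxValue / msPerYear  // = 292471208, i.e. roughly 292 million years after 1970
> {code}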
>
> ----
> For an easier reproduction, please consider:
> {code:java}
> scala> org.apache.spark.sql.catalyst.util.DateTimeUtils.fromJavaDate(new java.sql.Date(Long.MaxValue))
> java.lang.ArithmeticException: integer overflow
> at java.lang.Math.toIntExact(Math.java:1011)
> at org.apache.spark.sql.catalyst.util.DateTimeUtils$.fromJavaDate(DateTimeUtils.scala:111)
> ... 47 elided
> {code}
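> For completeness, the overflow can be reproduced without Spark at all. The snippet below is my own reduction, mirroring the Math.toIntExact call visible in the stack trace; the floorDiv conversion and the 86400000L constant are assumptions of the sketch rather than the exact DateTimeUtils code:
> {code:java}
> // The epoch-day value for Long.MaxValue milliseconds is far outside Int range,
> // so narrowing it with Math.toIntExact throws the same ArithmeticException.
> val epochDay: Long = Math.floorDiv(Long.MaxValue, 86400000L)  // 106751991167
> Math.toIntExact(epochDay)  // java.lang.ArithmeticException: integer overflow
> {code}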
> However, the question is: even if such late dates are not supported, could it fail in a more gentle way?
>