cloud-fan commented on a change in pull request #33413:
URL: https://github.com/apache/spark/pull/33413#discussion_r678816260
##########
File path:
external/avro/src/main/scala/org/apache/spark/sql/avro/AvroDeserializer.scala
##########
@@ -147,6 +147,21 @@ private[sql] class AvroDeserializer(
        s"Avro logical type $other cannot be converted to SQL type ${TimestampType.sql}.")
}
+ case (LONG, TimestampNTZType) => avroType.getLogicalType match {
+      // For backward compatibility, if the Avro type is Long and it has no logical type
+      // (the `null` case), the value is processed as a timestamp without time zone
+      // with millisecond precision.
+ case null | _: LocalTimestampMillis => (updater, ordinal, value) =>
+ val millis = value.asInstanceOf[Long]
+ val micros = DateTimeUtils.millisToMicros(millis)
+ updater.setLong(ordinal, timestampRebaseFunc(micros))
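
A standalone sketch of the conversion the added lines perform (hypothetical names, not Spark's `AvroDeserializer` or `DateTimeUtils`): an Avro LONG with no logical type is treated as epoch milliseconds for `TimestampNTZType`, so it is widened to the microsecond precision Catalyst uses internally.

```scala
// Hypothetical standalone helper mirroring DateTimeUtils.millisToMicros:
// overflow-checked widening of epoch milliseconds to microseconds.
object NtzMillisSketch {
  private val MicrosPerMillis = 1000L

  def millisToMicros(millis: Long): Long =
    Math.multiplyExact(millis, MicrosPerMillis)

  def main(args: Array[String]): Unit = {
    // 1970-01-01T00:00:01.500 expressed as millis, widened to micros.
    println(millisToMicros(1500L)) // 1500000
  }
}
```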
Review comment:
It goes without saying that a modern system always uses the Proleptic
Gregorian calendar. The rebase exists only to keep backward compatibility with
legacy Spark versions. This is a new data type, so there are no backward
compatibility issues.
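
To illustrate the point above (a standalone sketch, not Spark code): rebasing exists because the legacy hybrid Julian/Gregorian calendar (`java.util.GregorianCalendar`, used by older Spark) and the Proleptic Gregorian calendar (`java.time`, used by modern Spark) map old dates to different epoch instants, while modern dates agree. The helper names here are hypothetical.

```scala
import java.time.{LocalDate, ZoneOffset}
import java.util.{GregorianCalendar, TimeZone}

object CalendarDiffSketch {
  // Epoch millis of midnight UTC in the legacy hybrid Julian/Gregorian calendar.
  def epochMillisHybrid(year: Int, month: Int, day: Int): Long = {
    val cal = new GregorianCalendar(TimeZone.getTimeZone("UTC"))
    cal.clear()
    cal.set(year, month - 1, day)
    cal.getTimeInMillis
  }

  // Epoch millis of midnight UTC in the Proleptic Gregorian calendar.
  def epochMillisProleptic(year: Int, month: Int, day: Int): Long =
    LocalDate.of(year, month, day)
      .atStartOfDay(ZoneOffset.UTC).toInstant.toEpochMilli

  def main(args: Array[String]): Unit = {
    // The two calendars diverge for dates before 1582-10-15...
    println(epochMillisHybrid(1000, 1, 1) == epochMillisProleptic(1000, 1, 1)) // false
    // ...but agree on modern dates, so rebasing is a no-op for them.
    println(epochMillisHybrid(2020, 1, 1) == epochMillisProleptic(2020, 1, 1)) // true
  }
}
```

For a brand-new type such as `TimestampNTZType`, no data was ever written with the hybrid calendar, which is why the rebase call is unnecessary.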
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
For additional commands, e-mail: [email protected]