gengliangwang commented on a change in pull request #33413:
URL: https://github.com/apache/spark/pull/33413#discussion_r678108718
##########
File path:
external/avro/src/main/scala/org/apache/spark/sql/avro/AvroDeserializer.scala
##########
@@ -147,6 +147,21 @@ private[sql] class AvroDeserializer(
s"Avro logical type $other cannot be converted to SQL type
${TimestampType.sql}.")
}
+ case (LONG, TimestampNTZType) => avroType.getLogicalType match {
+      // For backward compatibility, if the Avro type is Long and it has no logical type
+      // (the `null` case), the value is processed as timestamp without time zone type
+      // with millisecond precision.
+ case null | _: LocalTimestampMillis => (updater, ordinal, value) =>
+ val millis = value.asInstanceOf[Long]
+ val micros = DateTimeUtils.millisToMicros(millis)
+ updater.setLong(ordinal, timestampRebaseFunc(micros))
Review comment:
Hmmm, the TimestampNTZ type is independent of time zone offsets. On
read/write as a long value, it is treated as being in the UTC time zone, so it
seems we don't need the rebase.
We need to figure this out so that we can follow the same approach in other
data source support work.
cc @cloud-fan as well
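To illustrate the point above, here is a minimal, self-contained Scala sketch of the conversion in question. `millisToMicros` is a hypothetical stand-in for `DateTimeUtils.millisToMicros`, and the example assumes the Avro long carries `local-timestamp-millis` semantics (milliseconds from the epoch with no zone attached); it is not the actual `AvroDeserializer` code path.

```scala
import java.time.{LocalDateTime, ZoneOffset}

object TimestampNtzSketch {
  // Hypothetical helper mirroring DateTimeUtils.millisToMicros: widen the
  // millisecond value to Spark's internal microsecond precision.
  def millisToMicros(millis: Long): Long = Math.multiplyExact(millis, 1000L)

  def main(args: Array[String]): Unit = {
    // An Avro LONG with local-timestamp-millis semantics: milliseconds from
    // the epoch of 1970-01-01T00:00:00, with no time zone attached.
    val avroValue = 86400000L // one day after the epoch
    val micros = millisToMicros(avroValue)

    // Because TimestampNTZ carries no zone, interpreting the micros in UTC
    // on both read and write round-trips the wall-clock value unchanged,
    // which is why no calendar rebase appears to be needed.
    val ntz = LocalDateTime.ofEpochSecond(
      micros / 1000000L, ((micros % 1000000L) * 1000L).toInt, ZoneOffset.UTC)
    println(ntz) // prints 1970-01-02T00:00
  }
}
```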
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]