tianhanhu-db commented on code in PR #40678:
URL: https://github.com/apache/spark/pull/40678#discussion_r1160368163


##########
sql/core/src/main/scala/org/apache/spark/sql/jdbc/PostgresDialect.scala:
##########
@@ -98,6 +100,14 @@ private object PostgresDialect extends JdbcDialect with SQLConfHelper {
     case _ => None
   }
 
+  override def convertJavaTimestampToTimestampNTZ(t: Timestamp): Long = {
+    DateTimeUtils.localDateTimeToMicros(t.toLocalDateTime)

Review Comment:
   I will give a concrete Postgres read example where the general 
implementation would fail.
   
   Say a Timestamp of "2023-04-05 08:00:00" is stored in a Postgres database and we want to read it as a Spark TimestampNTZType from the America/Los_Angeles time zone. The expected result would be "2023-04-05 08:00:00".
   
   When we call `PostgresDriver.getTimestamp`, what happens under the hood is that the Postgres driver uses the default JVM time zone and creates a Timestamp representing the instant at which the wall clock in that zone reads the stored value. The Java Timestamp therefore effectively represents "2023-04-05 08:00:00 America/Los_Angeles".
   
   With the general conversion, we would simply store that instant's underlying microseconds from the epoch to represent the TimestampNTZType value. This is problematic because, when displaying a TimestampNTZType, we convert it to a LocalDateTime using UTC as the time zone. That gives us the erroneous result "2023-04-05 15:00:00".
   
   The Postgres-specific conversion first converts the Java Timestamp to a LocalDateTime before taking the microseconds from the epoch. This effectively reinterprets the Timestamp as "2023-04-05 08:00:00 UTC", so converting back for display yields the correct result.
   
   For writes it is a similar story. @cloud-fan @beliefer 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

