LuciferYang commented on code in PR #45716:
URL: https://github.com/apache/spark/pull/45716#discussion_r1542548283


##########
sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:
##########
@@ -215,8 +224,10 @@ trait SparkDateTimeUtils {
     val rebasedDays = rebaseGregorianToJulianDays(days)
     val localMillis = Math.multiplyExact(rebasedDays, MILLIS_PER_DAY)
     val timeZoneOffset = TimeZone.getDefault match {
-      case zoneInfo: ZoneInfo => zoneInfo.getOffsetsByWall(localMillis, null)
-      case timeZone: TimeZone => timeZone.getOffset(localMillis - timeZone.getRawOffset)
+      case zoneInfo: TimeZone if zoneInfo.getClass.getName == zoneInfoClassName =>

Review Comment:
   for fix:
   
   ```
   Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:27: object util is not a member of package sun
   Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: not found: type ZoneInfo
   Error: ] /home/runner/work/spark/spark/sql/api/src/main/scala/org/apache/spark/sql/catalyst/util/SparkDateTimeUtils.scala:218: value getOffsetsByWall is not a member of java.util.TimeZone
   ```
   
   Maybe we can just use `val timeZoneOffset = TimeZone.getDefault.getOffset(localMillis)`, but I'm not sure which case would verify the compatibility issue with 2.4.
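   A minimal Java sketch of the two offset computations being compared (the zone and instant below are hypothetical examples, not from the PR). `TimeZone.getOffset(millis)` interprets its argument as a UTC instant, while the removed `getOffsetsByWall` path treated `localMillis` as wall-clock time, so the two can disagree around DST transitions:

   ```java
   import java.util.TimeZone;

   public class OffsetSketch {
       public static void main(String[] args) {
           // Hypothetical zone and instant, chosen for illustration only.
           TimeZone tz = TimeZone.getTimeZone("America/Los_Angeles");
           long localMillis = 946684800000L; // 2000-01-01T00:00:00Z

           // Proposed simplification: treat localMillis as a UTC instant.
           int direct = tz.getOffset(localMillis);

           // Pre-existing non-ZoneInfo fallback: approximate the wall-clock
           // offset by shifting back by the raw (non-DST) zone offset first.
           int viaRawOffset = tz.getOffset(localMillis - tz.getRawOffset());

           // In winter (no DST in effect) both yield the standard offset.
           System.out.println(direct);       // -28800000 (UTC-8)
           System.out.println(viaRawOffset); // -28800000 (UTC-8)
       }
   }
   ```

   For instants where DST is not in effect the two agree, as above; near a spring-forward gap they may not, which is presumably the kind of 2.4-era behavior the compatibility concern is about.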



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

