advancedxy commented on a change in pull request #23812:
URL: https://github.com/apache/spark/pull/23812#discussion_r562406723
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -67,13 +66,9 @@ object DateTimeUtils {
   def defaultTimeZone(): TimeZone = TimeZone.getDefault()

-  private val computedTimeZones = new ConcurrentHashMap[String, TimeZone]
-  private val computeTimeZone = new JFunction[String, TimeZone] {
-    override def apply(timeZoneId: String): TimeZone = TimeZone.getTimeZone(timeZoneId)
-  }
-
   def getTimeZone(timeZoneId: String): TimeZone = {
-    computedTimeZones.computeIfAbsent(timeZoneId, computeTimeZone)
+    val zoneId = ZoneId.of(timeZoneId, ZoneId.SHORT_IDS)
Review comment:
Hi @MaxGekk, after upgrading from Spark 2.3 to Spark 3.0, we found that this
behaviour change rejects some previously valid timezone IDs, for example:
```
// GMT+8:00 is a valid timezone if parsed from TimeZone.getTimeZone("GMT+8:00")
// However, ZoneId.of("GMT+8:00", ZoneId.SHORT_IDS) is rejected with an exception
from_unix_time("2020-01-01 10:00:00", "GMT+8:00")
```
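The difference can be reproduced with the JDK alone, no Spark required. A minimal sketch (class name `ZoneIdCompat` is mine): the legacy `java.util.TimeZone` API accepts the custom ID `GMT+8:00` and normalizes it, while `java.time.ZoneId` only parses offsets in the forms `ZoneOffset` documents (`+h`, `+hh`, `+hh:mm`, ...), so the single-digit-hour-with-minutes form throws `DateTimeException`:

```java
import java.time.DateTimeException;
import java.time.ZoneId;
import java.util.TimeZone;

public class ZoneIdCompat {
    public static void main(String[] args) {
        // Legacy API: custom IDs like "GMT+8:00" are accepted and
        // normalized to the two-digit-hour form.
        TimeZone tz = TimeZone.getTimeZone("GMT+8:00");
        System.out.println(tz.getID()); // GMT+08:00

        // java.time API: "GMT+8:00" is not a valid ZoneOffset suffix,
        // so the same string is rejected.
        try {
            ZoneId.of("GMT+8:00", ZoneId.SHORT_IDS);
            System.out.println("accepted");
        } catch (DateTimeException e) {
            System.out.println("rejected");
        }
    }
}
```

Note that the already-normalized form `GMT+08:00` is accepted by both APIs, which is one possible compatibility path: normalize the legacy ID before calling `ZoneId.of`.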
What do you think about supporting these kinds of timezones, such as `GMT+8:00`?
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]