MaxGekk commented on a change in pull request #34675:
URL: https://github.com/apache/spark/pull/34675#discussion_r758719068



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##########
@@ -59,7 +59,10 @@ trait TimeZoneAwareExpression extends Expression {
   /** Returns a copy of this expression with the specified timeZoneId. */
   def withTimeZone(timeZoneId: String): TimeZoneAwareExpression
 
-  @transient lazy val zoneId: ZoneId = DateTimeUtils.getZoneId(timeZoneId.get)
+  @transient lazy val zoneId: ZoneId = timeZoneId match {
+    case Some(x) => DateTimeUtils.getZoneId(x)
+    case None => TimeZone.getDefault.toZoneId
+  }

Review comment:
       I don't think we should use the default JVM time zone here. At the very least, fall back to `spark.sql.session.timeZone`.
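
       For illustration, a minimal sketch of that fallback, assuming the expression can reach Spark's internal `SQLConf` (whose `sessionLocalTimeZone` resolves `spark.sql.session.timeZone`) — a sketch of the suggestion, not the code in this PR:
   ```scala
   import java.time.ZoneId

   import org.apache.spark.sql.catalyst.util.DateTimeUtils
   import org.apache.spark.sql.internal.SQLConf

   // Sketch: prefer the explicit timeZoneId, then the session time zone,
   // never the raw JVM default.
   @transient lazy val zoneId: ZoneId = timeZoneId match {
     case Some(x) => DateTimeUtils.getZoneId(x)
     // spark.sql.session.timeZone, resolved through the active SQLConf.
     case None => DateTimeUtils.getZoneId(SQLConf.get.sessionLocalTimeZone)
   }
   ```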

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##########
@@ -59,7 +59,10 @@ trait TimeZoneAwareExpression extends Expression {
   /** Returns a copy of this expression with the specified timeZoneId. */
   def withTimeZone(timeZoneId: String): TimeZoneAwareExpression
 
-  @transient lazy val zoneId: ZoneId = DateTimeUtils.getZoneId(timeZoneId.get)

Review comment:
       If you are prepared to use internal APIs, you can just set the time zone manually:
   ```scala
   scala> hour(current_timestamp).expr.asInstanceOf[TimeZoneAwareExpression].withTimeZone("UTC").eval()
   res6: Any = 20

   scala> hour(current_timestamp).expr.asInstanceOf[TimeZoneAwareExpression].withTimeZone("Europe/Moscow").eval()
   res7: Any = 23
   ```
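
       (The two results differ by the three-hour UTC/Moscow offset at that instant.) A variant of the same workaround that pins the expression to the session time zone instead of a hardcoded ID, using only the public runtime conf — assumes a `spark` session is in scope, as in `spark-shell`:
   ```scala
   import org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression
   import org.apache.spark.sql.functions.{current_timestamp, hour}

   // spark.sql.session.timeZone, read through the public RuntimeConfig API.
   val sessionZone = spark.conf.get("spark.sql.session.timeZone")

   // Same cast-and-eval workaround as above, pinned to the session zone.
   hour(current_timestamp).expr
     .asInstanceOf[TimeZoneAwareExpression]
     .withTimeZone(sessionZone)
     .eval()
   ```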




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
