zhengruifeng commented on code in PR #53161:
URL: https://github.com/apache/spark/pull/53161#discussion_r2674539569


##########
python/pyspark/worker.py:
##########
@@ -3304,8 +3305,12 @@ def main(infile, outfile):
             sys.exit(-1)
         start_faulthandler_periodic_traceback()
 
-        # Use the local timezone to convert the timestamp
-        tz = datetime.datetime.now().astimezone().tzinfo
+        tzname = os.environ.get("SPARK_SESSION_LOCAL_TIMEZONE", None)

Review Comment:
   > It can be enabled by default, but when disabled, the behavior should be 
the original behavior.
   
   @ueshin how can we disable a runtime config that has a default value? It still returns the default value after `conf.unset`:
   
   ```
   In [1]: spark.conf.get("spark.sql.session.timeZone")
   Out[1]: 'Asia/Shanghai'
   
   In [2]: spark.conf.unset("spark.sql.session.timeZone")
   
   In [3]: spark.conf.get("spark.sql.session.timeZone")
   Out[3]: 'Asia/Shanghai'
   
   In [4]: spark.conf.set("spark.sql.session.timeZone", "Asia/Tokyo")
   
   In [5]: spark.conf.get("spark.sql.session.timeZone")
   Out[5]: 'Asia/Tokyo'
   
   In [6]: spark.conf.unset("spark.sql.session.timeZone")
   
   In [7]: spark.conf.get("spark.sql.session.timeZone")
   Out[7]: 'Asia/Shanghai'
   ```
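
   For context, a minimal sketch of the worker-side logic in the diff above (function name is mine, not from the PR): prefer the timezone name passed via `SPARK_SESSION_LOCAL_TIMEZONE`, and fall back to the original behavior (the machine's local timezone) when the variable is not set.

   ```python
   import datetime
   import os
   from zoneinfo import ZoneInfo

   def resolve_session_timezone():
       # Prefer the session timezone propagated through the environment;
       # fall back to the local timezone (the pre-PR behavior) when unset.
       tzname = os.environ.get("SPARK_SESSION_LOCAL_TIMEZONE", None)
       if tzname is not None:
           return ZoneInfo(tzname)
       return datetime.datetime.now().astimezone().tzinfo
   ```

   With this shape, "disabling" the feature would mean not exporting the variable at all, so the fallback branch reproduces the original behavior.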



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
