moritzkoerber commented on code in PR #36944:
URL: https://github.com/apache/spark/pull/36944#discussion_r917279175


##########
python/pyspark/sql/functions.py:
##########
@@ -2519,19 +2519,25 @@ def to_utc_timestamp(timestamp: "ColumnOrName", tz: "ColumnOrName") -> Column:
 
 def timestamp_seconds(col: "ColumnOrName") -> Column:
     """
+    Converts the number of seconds from the Unix epoch (1970-01-01T00:00:00Z)
+    to a timestamp.
+
     .. versionadded:: 3.1.0
 
     Examples
     --------
     >>> from pyspark.sql.functions import timestamp_seconds
-    >>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
+    >>> spark.conf.set("spark.sql.session.timeZone", "UTC")

Review Comment:
   I find that specifying a particular time zone might insinuate that the
result depends on the configured time zone, when in fact it only affects the
output of `.show()`. No strong opinion here though, happy to revert if you
want to keep the changes to a minimum.
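
   For what it's worth, the point can be illustrated without Spark: a Unix
timestamp names a single instant, and the session time zone only changes how
that instant is rendered. A minimal plain-Python sketch (the epoch value
1230219000 is just an arbitrary example, not taken from this diff):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One instant: 1230219000 seconds since the Unix epoch (1970-01-01T00:00:00Z).
instant = datetime.fromtimestamp(1230219000, tz=timezone.utc)

# Converting to another zone changes only the rendering, not the instant.
utc = instant.astimezone(timezone.utc)
la = instant.astimezone(ZoneInfo("America/Los_Angeles"))

print(utc.isoformat())  # 2008-12-25T15:30:00+00:00
print(la.isoformat())   # 2008-12-25T07:30:00-08:00
print(la == utc)        # True: same underlying instant, different display
```

Analogously, `timestamp_seconds` always produces the same internal timestamp;
`spark.sql.session.timeZone` only decides how `.show()` prints it.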



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

