HyukjinKwon commented on a change in pull request #28593:
URL: https://github.com/apache/spark/pull/28593#discussion_r437982378
##########
File path: python/pyspark/sql/functions.py
##########
@@ -1427,6 +1427,19 @@ def to_utc_timestamp(timestamp, tz):
return Column(sc._jvm.functions.to_utc_timestamp(_to_java_column(timestamp), tz))
+@since(3.1)
+def timestamp_seconds(col):
Review comment:
There are two (automatic-ish?) ways, both sketched below:
- Using `expr(...)`. I think we usually recommend this way when the SQL
functions do not exist in `functions.scala`.
- We can simply add this to [this
dictionary](https://github.com/apache/spark/blob/c7f2a9b323c5354c5dab1354c9a9bda19274dcdc/python/pyspark/sql/functions.py#L130-L136).
This way is currently rather discouraged, though: it was discussed in the mailing
list before, and we should convert that dictionary into individual function
definitions for better static analysis and IDE support.
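To illustrate the first option, here is a minimal sketch of calling the function through `expr(...)`. It assumes an active `SparkSession` named `spark` and a Spark version whose SQL parser already registers `timestamp_seconds`:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import expr

spark = SparkSession.builder.getOrCreate()

# expr() hands the string to the SQL parser, so any function registered
# on the SQL side works even without a dedicated wrapper in functions.py.
df = spark.createDataFrame([(1230219000,)], ["unix_time"])
df.select(expr("timestamp_seconds(unix_time)").alias("ts")).show()
```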
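For the second option, the dictionary generates trivial wrappers in a loop; converting it means writing each wrapper out explicitly, following the same pattern as `to_utc_timestamp` in the diff above. A rough sketch as it would look inside `python/pyspark/sql/functions.py` (the `_functions_3_1` name is hypothetical, and it assumes `functions.timestamp_seconds` exists on the Scala side; otherwise the `expr` route applies):

```python
from pyspark import SparkContext, since
from pyspark.sql.column import Column, _to_java_column

# Discouraged dictionary pattern, shown for reference only:
# _functions_3_1 = {
#     'timestamp_seconds': 'Converts the number of seconds from the '
#                          'Unix epoch to a timestamp.',
# }
# for _name, _doc in _functions_3_1.items():
#     globals()[_name] = since(3.1)(_create_function(_name, _doc))

# Explicit definition, which is friendlier to static analysis and IDEs:
@since(3.1)
def timestamp_seconds(col):
    """
    Converts the number of seconds from the Unix epoch
    (1970-01-01 00:00:00 UTC) to a timestamp.
    """
    sc = SparkContext._active_spark_context
    return Column(sc._jvm.functions.timestamp_seconds(_to_java_column(col)))
```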