HyukjinKwon commented on a change in pull request #30181:
URL: https://github.com/apache/spark/pull/30181#discussion_r514213238
##########
File path: python/pyspark/sql/column.py
##########
@@ -614,20 +706,30 @@ def isin(self, *cols):
     isNull = _unary_op("isNull", _isNull_doc)
     isNotNull = _unary_op("isNotNull", _isNotNull_doc)

-    @since(1.3)
     def alias(self, *alias, **kwargs):
         """
         Returns this column aliased with a new name or names (in the case of expressions that
         return more than one column, such as explode).

-        :param alias: strings of desired column names (collects all positional arguments passed)
-        :param metadata: a dict of information to be stored in ``metadata`` attribute of the
+        .. versionadded:: 1.3.0
+
+        Parameters
+        ----------
+        alias : str
+            strings of desired column names (collects all positional arguments passed)
+
+        Other Parameters
Review comment:
This seems to be the way to document the parameters passed via `kwargs`; see
https://numpydoc.readthedocs.io/en/latest/format.html
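
For reference, a minimal sketch of how `**kwargs`-style options can be listed under an `Other Parameters` section in the numpydoc convention; the `metadata` entry and its wording below are illustrative, not necessarily the exact text in this PR:

```python
# Sketch of the numpydoc layout for options passed via ``**kwargs``
# (https://numpydoc.readthedocs.io/en/latest/format.html).
# The ``metadata`` entry is illustrative; the method body is omitted.
def alias(self, *alias, **kwargs):
    """
    Returns this column aliased with a new name or names (in the case of expressions that
    return more than one column, such as explode).

    .. versionadded:: 1.3.0

    Parameters
    ----------
    alias : str
        strings of desired column names (collects all positional arguments passed)

    Other Parameters
    ----------------
    metadata : dict
        a dict of information to be stored in the ``metadata`` attribute of the
        corresponding ``StructField`` (optional, keyword-only argument)
    """
    ...
```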