Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20962#discussion_r179043803
  
    --- Diff: python/pyspark/sql/functions.py ---
    @@ -87,7 +87,15 @@ def _():
         'col': 'Returns a :class:`Column` based on the given column name.',
         'column': 'Returns a :class:`Column` based on the given column name.',
     'asc': 'Returns a sort expression based on the ascending order of the given column name.',
    +    'asc_nulls_first': 'Returns a sort expression based on the ascending order of the given' +
    +                       ' column name, and null values return before non-null values.',
    +    'asc_nulls_last': 'Returns a sort expression based on the ascending order of the given' +
    +                      ' column name, and null values appear after non-null values.',
         'desc': 'Returns a sort expression based on the descending order of the given column name.',
    +    'desc_nulls_first': 'Returns a sort expression based on the descending order of the given' +
    +                        ' column name, and null values appear before non-null values.',
    +    'desc_nulls_last': 'Returns a sort expression based on the descending order of the given' +
    +                       ' column name, and null values appear after non-null values',
    --- End diff --
    
    I think you can make another dictionary:
    
    ```
    _functions_2_4 = {
    ...
    ```
    
    and
    
    ```
    for _name, _doc in _functions_2_4.items():
        globals()[_name] = since(2.4)(_create_function(_name, _doc))
    ```
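    
    Roughly, something like the sketch below (self-contained so it runs on its own; the `since` and `_create_function` defined here are simplified stand-ins for the helpers that already exist in `functions.py`, and the docstrings are just copied from the diff):
    
    ```
    # Simplified stand-ins for the real helpers in pyspark/sql/functions.py,
    # included only so this sketch is runnable by itself.
    def since(version):
        def deco(f):
            # Append the "versionadded" note to the generated function's docstring.
            f.__doc__ = (f.__doc__ or '') + "\n\n.. versionadded:: %s" % version
            return f
        return deco


    def _create_function(name, doc=""):
        # The real helper wraps the corresponding JVM function; this one only
        # carries the name and docstring for illustration.
        def _(col):
            raise NotImplementedError("sketch only")
        _.__name__ = name
        _.__doc__ = doc
        return _


    # Functions added in Spark 2.4, kept in their own dict so they can be
    # registered with since(2.4) instead of the version used for `_functions`.
    _functions_2_4 = {
        'asc_nulls_first': 'Returns a sort expression based on the ascending order of the given' +
                           ' column name, and null values return before non-null values.',
        'asc_nulls_last': 'Returns a sort expression based on the ascending order of the given' +
                          ' column name, and null values appear after non-null values.',
        'desc_nulls_first': 'Returns a sort expression based on the descending order of the given' +
                            ' column name, and null values appear before non-null values.',
        'desc_nulls_last': 'Returns a sort expression based on the descending order of the given' +
                           ' column name, and null values appear after non-null values',
    }

    for _name, _doc in _functions_2_4.items():
        globals()[_name] = since(2.4)(_create_function(_name, _doc))

    print(asc_nulls_first.__doc__)  # docstring followed by ".. versionadded:: 2.4"
    ```
    
    That way the 2.4 entries pick up the right `.. versionadded::` version without touching the existing `_functions` dict.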


