Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22227#discussion_r217563904
  
    --- Diff: python/pyspark/sql/functions.py ---
    @@ -1671,18 +1671,32 @@ def repeat(col, n):
     
     @since(1.5)
     @ignore_unicode_prefix
    -def split(str, pattern):
    +def split(str, pattern, limit=-1):
         """
    -    Splits str around pattern (pattern is a regular expression).
    +    Splits str around matches of the given pattern.
     
    -    .. note:: pattern is a string represent the regular expression.
    +    :param str: a string expression to split
    +    :param pattern: a string representing a regular expression. The regex string should be
    +                  a Java regular expression.
    +    :param limit: an integer which controls the number of times `pattern` is applied.
     
    -    >>> df = spark.createDataFrame([('ab12cd',)], ['s',])
    -    >>> df.select(split(df.s, '[0-9]+').alias('s')).collect()
    -    [Row(s=[u'ab', u'cd'])]
    +            * ``limit > 0``: The resulting array's length will not be more than `limit`, and the
    +                             resulting array's last entry will contain all input beyond the last
    +                             matched pattern.
    +            * ``limit <= 0``: `pattern` will be applied as many times as possible, and the resulting
    +                              array can be of any size.
    +
    +    .. versionchanged:: 3.0
    +       `split` now takes an optional `limit` field. If not provided, default limit value is -1.
    +
    +    >>> df = spark.createDataFrame([('oneAtwoBthreeC',)], ['s',])
    +    >>> df.select(split(df.s, '[ABC]', 2).alias('s')).collect()
    +    [Row(s=[u'one', u'twoBthreeC'])]
    +    >>> df.select(split(df.s, '[ABC]', -1).alias('s')).collect()
    --- End diff ---
    
    Let's turn this into an example without the `limit` argument.
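    
    For reference, a minimal sketch of how the doctest could read without the `limit` argument (the expected output is an assumption based on the default `limit=-1` keeping trailing empty strings, as Java's `split` does with a negative limit):
    
        >>> df = spark.createDataFrame([('oneAtwoBthreeC',)], ['s',])
        >>> df.select(split(df.s, '[ABC]', 2).alias('s')).collect()
        [Row(s=[u'one', u'twoBthreeC'])]
        >>> df.select(split(df.s, '[ABC]').alias('s')).collect()
        [Row(s=[u'one', u'two', u'three', u''])]
    
    With `limit` omitted, the call falls back to the -1 default, so the trailing empty string after the final 'C' match should appear in the result.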


---
