Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17429#discussion_r108030625
  
    --- Diff: python/pyspark/sql/functions.py ---
    @@ -1675,15 +1675,18 @@ def array(*cols):
     @since(1.5)
     def array_contains(col, value):
         """
    -    Collection function: returns True if the array contains the given 
value. The collection
    -    elements and value must be of the same type.
    +    Collection function: returns null if the array is null, true if the 
array contains the
    +    given value, and false otherwise.
    --- End diff --
    
    Other documentation uses `true` rather than `True`, so I matched this to
`true`. I am willing to sweep the rest if anyone feels this should be fixed.
    The reason I removed `The collection elements and value must be of the same
type` is that it seems we can pass other types that are implicitly castable.
This is not documented in Scala/R either, so I instead provided a doctest as
an exception in the Python documentation.
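
    For illustration, the documented three-way behavior (null array → null,
otherwise true/false on membership) can be modeled in plain Python — this is
only a sketch of the semantics, not Spark's actual implementation, and the
helper name is hypothetical:

    ```python
    # Sketch of the documented array_contains semantics in plain Python,
    # using None to stand in for SQL NULL (hypothetical helper, not Spark code).
    def array_contains_semantics(arr, value):
        if arr is None:
            return None  # null array yields null
        # membership test; Python's == also accepts implicitly comparable
        # types, loosely mirroring the implicit-cast behavior noted above
        return value in arr

    print(array_contains_semantics(None, 1))       # null case
    print(array_contains_semantics([1, 2, 3], 2))  # contained
    print(array_contains_semantics(["a"], "b"))    # not contained
    ```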



