Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21416#discussion_r191486882
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/Column.scala ---
    @@ -786,6 +787,24 @@ class Column(val expr: Expression) extends Logging {
       @scala.annotation.varargs
       def isin(list: Any*): Column = withExpr { In(expr, list.map(lit(_).expr)) }
     
    +  /**
    +   * A boolean expression that is evaluated to true if the value of this expression is contained
    +   * by the provided collection.
    +   *
    +   * @group expr_ops
    +   * @since 2.4.0
    +   */
    +  def isInCollection(values: scala.collection.Iterable[_]): Column = isin(values.toSeq: _*)
    +
    +  /**
    +   * A boolean expression that is evaluated to true if the value of this expression is contained
    +   * by the provided collection.
    +   *
    +   * @group java_expr_ops
    +   * @since 2.4.0
    +   */
    +  def isInCollection(values: java.lang.Iterable[_]): Column = isInCollection(values.asScala)
    --- End diff --
    
    Not that we strictly need it, but in the past some of our Java-facing APIs
have turned out to be difficult to call from Java, and I think that if we're
adding an API designed to be called from Java it makes sense to have a test
case for it in Java. I understand it's not super important here, so this is
just a suggestion.
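
    For what it's worth, here's a minimal sketch of what such a Java-side test
might look like (the suite and method names are hypothetical, and I'm assuming
JUnit 4 plus a throwaway local SparkSession rather than whatever shared test
fixture the existing Java suites use):

        import static org.apache.spark.sql.functions.col;

        import java.util.Arrays;

        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Row;
        import org.apache.spark.sql.SparkSession;
        import org.junit.AfterClass;
        import org.junit.Assert;
        import org.junit.Test;

        public class JavaIsInCollectionSuite {
          // Throwaway local session for the sketch; a real suite would reuse a
          // shared test fixture instead.
          private static final SparkSession spark = SparkSession.builder()
              .master("local[2]")
              .appName("JavaIsInCollectionSuite")
              .getOrCreate();

          @AfterClass
          public static void tearDown() {
            spark.stop();
          }

          @Test
          public void isInCollectionIsCallableFromJava() {
            Dataset<Row> df = spark.range(5).toDF("id");
            // Exercise the java.lang.Iterable overload directly from Java source:
            // rows 1 and 3 out of 0..4 should match.
            long matches = df.filter(col("id").isInCollection(Arrays.asList(1L, 3L))).count();
            Assert.assertEquals(2L, matches);
          }
        }

    The main value would be pinning down that the java.lang.Iterable overload
resolves cleanly from Java source without any casts.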


---
