Github user rxin commented on the issue:

    https://github.com/apache/spark/pull/23178
  
    Good idea to have it sealed!
    
    > On Nov 29, 2018, at 7:04 AM, Sean Owen <notificati...@github.com> wrote:
    > 
    > @srowen commented on this pull request.
    > 
    > In sql/core/src/main/scala/org/apache/spark/sql/expressions/UserDefinedFunction.scala:
    > 
    > >      if (inputTypes.isDefined) {
    >        assert(inputTypes.get.length == nullableTypes.get.length)
    >      }
    >  
    > +    val inputsNullSafe = if (nullableTypes.isEmpty) {
    > You can use getOrElse here and even inline this into the call below, but I don't really care.
    > 
    > In sql/core/src/main/scala/org/apache/spark/sql/expressions/UserDefinedFunction.scala:
    > 
    > > @@ -38,114 +38,108 @@ import org.apache.spark.sql.types.DataType
    >   * @since 1.3.0
    >   */
    >  @Stable
    > -case class UserDefinedFunction protected[sql] (
    > -    f: AnyRef,
    > -    dataType: DataType,
    > -    inputTypes: Option[Seq[DataType]]) {
    > -
    > -  private var _nameOption: Option[String] = None
    > -  private var _nullable: Boolean = true
    > -  private var _deterministic: Boolean = true
    > -
    > -  // This is a `var` instead of in the constructor for backward compatibility of this case class.
    > -  // TODO: revisit this case class in Spark 3.0, and narrow down the public surface.
    > -  private[sql] var nullableTypes: Option[Seq[Boolean]] = None
    > +trait UserDefinedFunction {
    > Should we make this sealed? I'm not sure. Would any user ever extend this meaningfully? I kind of worry someone will start doing so; maybe they already subclass it in some cases though. Elsewhere it might help the compiler understand in match statements that there is only ever one type of UDF class to match on.
    > 
    > —
    > You are receiving this because you were mentioned.
    > Reply to this email directly, view it on GitHub, or mute the thread.
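The `getOrElse` suggestion above could look roughly like the following sketch (hypothetical values, not the actual Spark code; `arity` stands in for however many inputs the UDF takes):

```scala
// Hypothetical stand-ins for the fields discussed in the review.
val nullableTypes: Option[Seq[Boolean]] = None
val arity = 3

// Instead of branching on Option emptiness, default inline with getOrElse:
// fall back to "every input is nullable" when no nullability info was captured.
val inputsNullSafe: Seq[Boolean] = nullableTypes.getOrElse(Seq.fill(arity)(true))
```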



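The benefit of sealing that the review mentions can be sketched as follows (hypothetical simplified types, not Spark's actual API): with `sealed`, all subtypes must live in the same file, so the compiler can check pattern matches for exhaustiveness.

```scala
// A minimal sketch of a sealed UDF hierarchy. `sealed` means every subtype
// must be declared in this file, so the compiler knows the full set of cases.
sealed trait UserDefinedFunction {
  def dataType: String
}

// Hypothetical sole implementation, mirroring the idea of one internal UDF class.
final case class SparkUserDefinedFunction(dataType: String) extends UserDefinedFunction

// This match is provably exhaustive: if a new subtype were added, the compiler
// would warn here. Without `sealed`, no such check is possible.
def describe(udf: UserDefinedFunction): String = udf match {
  case SparkUserDefinedFunction(dt) => s"UDF returning $dt"
}
```

Sealing also prevents users from extending the trait outside Spark, which addresses the worry about third-party subclasses.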