Github user drewrobb commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23062#discussion_r234861629
  
    --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/ScalaReflectionSuite.scala ---
    @@ -109,6 +109,35 @@ object TestingUDT {
       }
     }
     
    +/** An example derived from Twitter/Scrooge codegen for thrift  */
    +object ScroogeLikeExample {
    +  def apply(x: Int): ScroogeLikeExample = new Immutable(x)
    +
    +  def unapply(_item: ScroogeLikeExample): Option[Int] = Some(_item.x)
    +
    +  class Immutable(val x: Int) extends ScroogeLikeExample
    +}
    +
    +trait ScroogeLikeExample extends Product1[Int] with Serializable {
    +  import ScroogeLikeExample._
    +
    +  def x: Int
    +
    +  override def _1: Int = x
    +
    +  def copy(x: Int = this.x): ScroogeLikeExample = new Immutable(x)
    +
    +  override def canEqual(other: Any): Boolean = other.isInstanceOf[ScroogeLikeExample]
    +
    +  private def _equals(x: ScroogeLikeExample, y: ScroogeLikeExample): Boolean =
    +      x.productArity == y.productArity &&
    --- End diff --
    
    I'm worried about changing the tests to use a concrete subtype, because the reflection calls might behave differently in that case, either now or in the future. I simplified it a little more. `canEqual` is necessary to implement `Product`. `equals` is necessary or the tests will not pass (they would otherwise only check object reference equality), and `hashCode` is needed for scalastyle to pass once `equals` is overridden.
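    
    For context, here is a minimal sketch of how the full trait might look. This is not the exact PR code: the diff above is truncated at `_equals`, so its body (comparing arity and product elements) and the `hashCode` implementation are assumptions.
    
        // Hypothetical sketch, not verbatim PR code: completes the members discussed
        // above, assuming _equals goes on to compare the product elements.
        object ScroogeLikeExample {
          def apply(x: Int): ScroogeLikeExample = new Immutable(x)
          def unapply(_item: ScroogeLikeExample): Option[Int] = Some(_item.x)
          class Immutable(val x: Int) extends ScroogeLikeExample
        }
    
        trait ScroogeLikeExample extends Product1[Int] with Serializable {
          import ScroogeLikeExample._
    
          def x: Int
          override def _1: Int = x
          def copy(x: Int = this.x): ScroogeLikeExample = new Immutable(x)
    
          // Required to implement Product: only other ScroogeLikeExample values can be equal.
          override def canEqual(other: Any): Boolean = other.isInstanceOf[ScroogeLikeExample]
    
          // Assumed continuation of the truncated helper: compare arity and elements.
          private def _equals(x: ScroogeLikeExample, y: ScroogeLikeExample): Boolean =
            x.productArity == y.productArity &&
              x.productIterator.sameElements(y.productIterator)
    
          // Structural equality; without this the tests would fall back to reference equality.
          override def equals(other: Any): Boolean =
            canEqual(other) && _equals(this, other.asInstanceOf[ScroogeLikeExample])
    
          // Scalastyle requires hashCode whenever equals is overridden.
          override def hashCode: Int = x
        }
    
    Structural `equals` is what lets a test compare a decoded value against a freshly constructed instance rather than relying on them being the same object.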


---
