Github user sadhen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22264#discussion_r214064282
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala ---
    @@ -290,6 +290,16 @@ object QueryTest {
         Row.fromSeq(row.toSeq.map {
           case null => null
           case d: java.math.BigDecimal => BigDecimal(d)
    +      // Equality of WrappedArray differs for AnyVal and AnyRef in Scala 2.12.2+
    +      case seq: Seq[_] => seq.map {
    --- End diff ---
    
    This is the full list of type mappings documented in `Row.scala`:
    ```
       *   BooleanType -> java.lang.Boolean
       *   ByteType -> java.lang.Byte
       *   ShortType -> java.lang.Short
       *   IntegerType -> java.lang.Integer
       *   LongType -> java.lang.Long
       *   FloatType -> java.lang.Float
       *   DoubleType -> java.lang.Double
       *   StringType -> String
       *   DecimalType -> java.math.BigDecimal
       *
       *   DateType -> java.sql.Date
       *   TimestampType -> java.sql.Timestamp
       *
       *   BinaryType -> byte array
       *   ArrayType -> scala.collection.Seq (use getList for java.util.List)
       *   MapType -> scala.collection.Map (use getJavaMap for java.util.Map)
       *   StructType -> org.apache.spark.sql.Row
    ```
    
    `Byte`, `Short`, `Integer`, `Long`, `Float`, and `Double` are handled because they extend `java.lang.Number`.
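    A minimal sketch of the idea behind the diff, using a hypothetical helper `normalize` (not the exact code in `QueryTest.scala`): unbox the `java.lang.Number` subclasses listed above back to Scala primitives so a sequence read from a `Row` compares uniformly against an expected `Seq` of primitives.
    ```scala
    // Hypothetical normalization of a Row-style Seq whose elements are the
    // boxed java.lang.* types from the mapping above.
    def normalize(seq: Seq[Any]): Seq[Any] = seq.map {
      case b: java.lang.Byte    => b.byteValue    // -> scala.Byte
      case s: java.lang.Short   => s.shortValue   // -> scala.Short
      case i: java.lang.Integer => i.intValue     // -> scala.Int
      case l: java.lang.Long    => l.longValue    // -> scala.Long
      case f: java.lang.Float   => f.floatValue   // -> scala.Float
      case d: java.lang.Double  => d.doubleValue  // -> scala.Double
      case other                => other          // Strings, Rows, etc. pass through
    }

    val boxed: Seq[Any] = Seq(java.lang.Integer.valueOf(1), java.lang.Integer.valueOf(2))
    val primitive: Seq[Any] = Seq(1, 2)
    // After normalization both sides carry the same runtime representation.
    assert(normalize(boxed) == primitive)
    ```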
    
    Equality for `AnyVal`s is handled specially. For example, in `Short.scala`:
    
    ```
      def ==(x : scala.Byte) : scala.Boolean
      def ==(x : scala.Short) : scala.Boolean
      def ==(x : scala.Char) : scala.Boolean
      def ==(x : scala.Int) : scala.Boolean
      def ==(x : scala.Long) : scala.Boolean
      def ==(x : scala.Float) : scala.Boolean
      def ==(x : scala.Double) : scala.Boolean
    ```
    
    And `scala.Char` never appears as a value in a Spark `Row`, so that overload is not relevant here.
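    To illustrate the distinction: the value-class overloads above make a statically typed `Short`/`Int` comparison numeric, while the boxed Java classes' `equals` compares by exact class; Scala's `==` on `Any` restores numeric behavior via cooperative equality.
    ```scala
    // Statically typed comparison resolves to the Short.==(Int) overload above.
    assert((1: Short) == (1: Int))

    // java.lang.Short.equals checks the runtime class, so Short vs Integer is false.
    assert(!java.lang.Short.valueOf(1.toShort).equals(java.lang.Integer.valueOf(1)))

    // Scala's == on Any uses cooperative (numeric) equality, so this is true.
    assert((1.toShort: Any) == (1: Any))
    ```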


