GitHub user ueshin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20637#discussion_r212882312
  
    --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ExpressionEvalHelper.scala ---
    @@ -223,8 +223,9 @@ trait ExpressionEvalHelper extends GeneratorDrivenPropertyChecks with PlanTestBa
               }
             } else {
               val lit = InternalRow(expected, expected)
    +          val dtAsNullable = expression.dataType.asNullable
    --- End diff ---
    
    I think we need `asNullable` there.
    Take `MapZipWith` with its wrong nullabilities (before #22126) as an example: the test [HigherOrderFunctionsSuite.scala#L312-L314](https://github.com/apache/spark/blob/d7ef82c53c10e6953d077801045ba1438c6670ab/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/HigherOrderFunctionsSuite.scala#L312-L314) should fail, and with `asNullable` we can detect the bug, whereas without `asNullable` the test passes mistakenly.
    
    I can understand that we should fail with an NPE, but for primitive types with wrong nullabilities (where the nullability should be `true` but was `false` by mistake), we use the default value for the type and produce a wrong result instead of throwing an NPE. Without `asNullable`, the expected value is also converted to the same wrong value, so the comparison between the two wrong values succeeds mistakenly; with `asNullable`, the expected value stays as is, so we can detect the wrong actual value.
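    To illustrate with a minimal sketch (not part of this diff, and simplified; the exact behavior depends on the generated code): when a primitive type is wrongly declared non-nullable, the projection skips the null check and reads the primitive slot, so a null input silently becomes the type's default value instead of raising an NPE.

```scala
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.{BoundReference, UnsafeProjection}
import org.apache.spark.sql.types.IntegerType

// A row whose only field is null.
val row = InternalRow(null)

// With nullable = false (the wrong nullability), the generated code skips
// the null check and reads the primitive slot, yielding the default value 0.
val wrong = UnsafeProjection.create(Seq(BoundReference(0, IntegerType, nullable = false)))
assert(wrong(row).getInt(0) == 0)  // wrong value, but no NPE

// With nullable = true (what asNullable gives us), the null survives,
// so comparing against a correctly converted expected row catches the bug.
val right = UnsafeProjection.create(Seq(BoundReference(0, IntegerType, nullable = true)))
assert(right(row).isNullAt(0))
```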

