Github user maropu commented on the issue:

    https://github.com/apache/spark/pull/21598
  
    It seems we don't have any behaviour change in the current PR (IIUC 
`spark.sql.legacy.sizeOfNull=true` keeps the current behaviour). If so, we 
don't need that update?
    
    But, is it okay to set this option to true by default? Based on 
three-valued logic, isn't `size(null)=null` more reasonable?
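    To make the two behaviours concrete, here is a minimal plain-Python sketch of the semantics under discussion (not Spark's actual implementation; the function name and flag parameter are illustrative): with the legacy flag on, `size(null)` returns `-1`, while under three-valued logic it propagates `null`.

```python
def size(col, legacy_size_of_null=True):
    """Illustrative sketch of Spark SQL's size() semantics.

    legacy_size_of_null mimics the spark.sql.legacy.sizeOfNull config:
    - True  -> size(null) returns -1 (legacy behaviour)
    - False -> size(null) returns null (three-valued logic)
    """
    if col is None:
        return -1 if legacy_size_of_null else None
    return len(col)

# Legacy behaviour: null input yields -1.
print(size(None, legacy_size_of_null=True))
# Three-valued logic: null input yields null (None).
print(size(None, legacy_size_of_null=False))
# Non-null input behaves the same either way.
print(size([1, 2, 3]))
```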


---
