Github user MaxGekk commented on the issue:
https://github.com/apache/spark/pull/21736
I am testing the changes and have found the following so far:
```
$ ./bin/spark-shell --master 'local-cluster[1, 1, 1024]'
```
By default, the "legacy" behavior is enabled:
```
scala> spark.conf.get("spark.sql.legacy.sizeOfNull")
res0: String = true
scala> spark.sql("select size(null)").show()
+----------+
|size(NULL)|
+----------+
| -1|
+----------+
```
Let's store the SQL query in a value:
```
scala> val df = spark.sql("select size(null)")
df: org.apache.spark.sql.DataFrame = [size(NULL): int]
```
and switch the behavior:
```
scala> spark.conf.set("spark.sql.legacy.sizeOfNull", "false")
scala> spark.conf.get("spark.sql.legacy.sizeOfNull")
res3: String = false
```
I would expect `null` here, but I still get `-1`:
```
scala> df.show()
+----------+
|size(NULL)|
+----------+
| -1|
+----------+
```