Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21598#discussion_r196888409
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -1314,6 +1314,13 @@ object SQLConf {
"Other column values can be ignored during parsing even if they are
malformed.")
.booleanConf
.createWithDefault(true)
+
+ val LEGACY_SIZE_OF_NULL = buildConf("spark.sql.legacy.sizeOfNull")
+ .internal()
+ .doc("If it is set to true, size of null returns -1. This is legacy
behavior of Hive. " +
--- End diff --
I will change the sentence; it seems it isn't clear. I just wanted to say that
Spark inherited the behavior from Hive: when `size()` was implemented, Hive's
`size(null)` returned `-1`. Most likely Hive still has this behavior at the
moment.
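
To illustrate what the flag is meant to control, here is a minimal, self-contained sketch (my own, not part of the diff) that assumes the conf lands as an ordinary session setting and that the non-legacy mode returns NULL for `size(null)`:

```scala
import org.apache.spark.sql.SparkSession

object SizeOfNullDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("sizeOfNull-demo")
      .getOrCreate()

    // Legacy (Hive-inherited) behavior: size of a null array is -1.
    spark.conf.set("spark.sql.legacy.sizeOfNull", "true")
    spark.sql("SELECT size(cast(null AS array<int>))").show()
    // expected output: -1

    // Assumed non-legacy behavior: size of a null array is NULL.
    spark.conf.set("spark.sql.legacy.sizeOfNull", "false")
    spark.sql("SELECT size(cast(null AS array<int>))").show()
    // expected output: NULL

    spark.stop()
  }
}
```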
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]