This is an automated email from the ASF dual-hosted git repository.

yamamuro pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8643e5d  [SPARK-31171][SQL][FOLLOWUP] update document
8643e5d is described below

commit 8643e5d9c50294f59b01988d99d447a38776178e
Author: Wenchen Fan <wenc...@databricks.com>
AuthorDate: Thu Mar 19 07:29:31 2020 +0900

    [SPARK-31171][SQL][FOLLOWUP] update document
    
    ### What changes were proposed in this pull request?
    
    A follow-up of https://github.com/apache/spark/pull/27936 to update the documentation.
    
    ### Why are the changes needed?
    
    To correct the documentation of the `size` function under ANSI mode.
    
    ### Does this PR introduce any user-facing change?
    
    no
    
    ### How was this patch tested?
    
    N/A
    
    Closes #27950 from cloud-fan/null.
    
    Authored-by: Wenchen Fan <wenc...@databricks.com>
    Signed-off-by: Takeshi Yamamuro <yamam...@apache.org>
---
 docs/sql-ref-ansi-compliance.md                                    | 7 ++++++-
 .../spark/sql/catalyst/expressions/collectionOperations.scala      | 6 +++---
 sql/core/src/main/scala/org/apache/spark/sql/functions.scala       | 4 ++++
 3 files changed, 13 insertions(+), 4 deletions(-)

diff --git a/docs/sql-ref-ansi-compliance.md b/docs/sql-ref-ansi-compliance.md
index 27e60b4..bc5bde6 100644
--- a/docs/sql-ref-ansi-compliance.md
+++ b/docs/sql-ref-ansi-compliance.md
@@ -21,7 +21,7 @@ license: |
 
 Since Spark 3.0, Spark SQL introduces two experimental options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
 
-When `spark.sql.ansi.enabled` is set to `true`, Spark SQL follows the standard in basic behaviours (e.g., arithmetic operations, type conversion, and SQL parsing).
+When `spark.sql.ansi.enabled` is set to `true`, Spark SQL follows the standard in basic behaviours (e.g., arithmetic operations, type conversion, SQL functions, and SQL parsing).
 Moreover, Spark SQL has an independent option to control implicit casting behaviours when inserting rows in a table.
 The casting behaviours are defined as store assignment rules in the standard.
 
@@ -140,6 +140,11 @@ SELECT * FROM t;
 
 {% endhighlight %}
 
+### SQL Functions
+
+The behavior of some SQL functions can be different under ANSI mode (`spark.sql.ansi.enabled=true`).
+  - `size`: This function returns null for null input under ANSI mode.
+
 ### SQL Keywords
 
 When `spark.sql.ansi.enabled` is true, Spark SQL will use the ANSI mode parser.
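
As an aside, the new behavior is easy to verify from spark-shell. A minimal sketch, assuming Spark 3.0+ and a local session; the config and function names come from the patch above, while the session setup is illustrative:

    import org.apache.spark.sql.SparkSession

    // Illustrative local session; any SparkSession works.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("ansi-size-demo")
      .getOrCreate()

    // Default settings: size of a null collection is -1.
    spark.sql("SELECT size(CAST(NULL AS ARRAY<INT>))").show()  // -1

    // Under ANSI mode, the function returns NULL for null input instead.
    spark.conf.set("spark.sql.ansi.enabled", "true")
    spark.sql("SELECT size(CAST(NULL AS ARRAY<INT>))").show()  // NULL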
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index 6d95909..8b61bc4 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -77,9 +77,9 @@ trait BinaryArrayExpressionWithImplicitCast extends BinaryExpression
 @ExpressionDescription(
   usage = """
     _FUNC_(expr) - Returns the size of an array or a map.
-    The function returns -1 if its input is null and spark.sql.legacy.sizeOfNull is set to true.
-    If spark.sql.legacy.sizeOfNull is set to false, the function returns null for null input.
-    By default, the spark.sql.legacy.sizeOfNull parameter is set to true.
+    The function returns null for null input if spark.sql.legacy.sizeOfNull is set to false or
+    spark.sql.ansi.enabled is set to true. Otherwise, the function returns -1 for null input.
+    With the default settings, the function returns -1 for null input.
   """,
   examples = """
     Examples:
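
The precedence spelled out in the new usage text (null when spark.sql.legacy.sizeOfNull is false or spark.sql.ansi.enabled is true, -1 otherwise) can be exercised directly from SQL. A rough sketch, reusing the session from the sketch above:

    // Make sure ANSI mode is off so the legacy flag decides the behavior.
    spark.sql("SET spark.sql.ansi.enabled=false")

    // Legacy flag on (the default): null input yields -1.
    spark.sql("SET spark.sql.legacy.sizeOfNull=true")
    spark.sql("SELECT size(CAST(NULL AS MAP<STRING, INT>))").show()  // -1

    // Legacy flag off: null input yields NULL, even without ANSI mode.
    spark.sql("SET spark.sql.legacy.sizeOfNull=false")
    spark.sql("SELECT size(CAST(NULL AS MAP<STRING, INT>))").show()  // NULL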
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/functions.scala b/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
index 5603f20..6e189df 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
@@ -3980,6 +3980,10 @@ object functions {
   /**
    * Returns length of array or map.
    *
+   * The function returns null for null input if spark.sql.legacy.sizeOfNull is set to false or
+   * spark.sql.ansi.enabled is set to true. Otherwise, the function returns -1 for null input.
+   * With the default settings, the function returns -1 for null input.
+   *
    * @group collection_funcs
    * @since 1.5.0
    */
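
The same semantics apply through the Scala DataFrame API documented above. A minimal sketch, assuming the session from the earlier sketches with default settings restored:

    import org.apache.spark.sql.functions.size
    import spark.implicits._

    // One real array row and one null row in a nullable array column.
    val df = Seq(Some(Seq(1, 2, 3)), Option.empty[Seq[Int]]).toDF("arr")

    // Default settings: 3 for the array, -1 for the null row.
    // With spark.sql.ansi.enabled=true (or the legacy flag off): 3 and null.
    df.select(size($"arr")).show()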


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org