This is an automated email from the ASF dual-hosted git repository.

gengliang pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.3 by this push:
     new ab057c72509 [SPARK-39237][DOCS] Update the ANSI SQL mode documentation
ab057c72509 is described below

commit ab057c72509006cbd8b501b6be4eb26793dc1e71
Author: Gengliang Wang <gengli...@apache.org>
AuthorDate: Fri May 20 16:58:22 2022 +0800

    [SPARK-39237][DOCS] Update the ANSI SQL mode documentation
    
    ### What changes were proposed in this pull request?
    
    1. Remove the Experimental notation in the ANSI SQL compliance doc
    2. Update the description of `spark.sql.ansi.enabled`, since the ANSI reserved keyword enforcement is disabled by default now
    
    ### Why are the changes needed?
    
    1. The ANSI SQL dialect went GA in the Spark 3.2 release: https://spark.apache.org/releases/spark-release-3-2-0.html
    We should not mark it as "Experimental" in the doc.
    2. The ANSI reserved keyword enforcement is disabled by default now.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, just a doc change.

    ### How was this patch tested?
    
    Doc preview:
    <img width="700" alt="image" 
src="https://user-images.githubusercontent.com/1097932/169444094-de9c33c2-1b01-4fc3-b583-b752c71e16d8.png";>
    
    <img width="1435" alt="image" 
src="https://user-images.githubusercontent.com/1097932/169472239-1edf218f-1f7b-48ec-bf2a-5d043600f1bc.png";>
    
    Closes #36614 from gengliangwang/updateAnsiDoc.
    
    Authored-by: Gengliang Wang <gengli...@apache.org>
    Signed-off-by: Gengliang Wang <gengli...@apache.org>
    (cherry picked from commit 86a351c13d62644d596cc5249fc1c45d318a0bbf)
    Signed-off-by: Gengliang Wang <gengli...@apache.org>
---
 docs/sql-ref-ansi-compliance.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/sql-ref-ansi-compliance.md b/docs/sql-ref-ansi-compliance.md
index 94ef94a5e7b..c4572c71f4a 100644
--- a/docs/sql-ref-ansi-compliance.md
+++ b/docs/sql-ref-ansi-compliance.md
@@ -19,7 +19,7 @@ license: |
   limitations under the License.
 ---
 
-Since Spark 3.0, Spark SQL introduces two experimental options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+In Spark SQL, there are two options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
 
 When `spark.sql.ansi.enabled` is set to `true`, Spark SQL uses an ANSI compliant dialect instead of being Hive compliant. For example, Spark will throw an exception at runtime instead of returning null results if the inputs to a SQL operator/function are invalid. Some ANSI dialect features may be not from the ANSI SQL standard directly, but their behaviors align with ANSI SQL's style.
 
@@ -28,10 +28,10 @@ The casting behaviours are defined as store assignment rules in the standard.
 
 When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while the configuration `spark.sql.ansi.enabled` is disabled by default.
 
-|Property Name|Default|Meaning|Since Version|
-|-------------|-------|-------|-------------|
-|`spark.sql.ansi.enabled`|false|(Experimental) When true, Spark tries to conform to the ANSI SQL specification: <br/> 1. Spark will throw a runtime exception if an overflow occurs in any operation on integral/decimal field. <br/> 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in the SQL parser.|3.0.0|
-|`spark.sql.storeAssignmentPolicy`|ANSI|(Experimental) When inserting a value into a column with different data type, Spark will perform type conversion.  Currently, we support 3 policies for the type coercion rules: ANSI, legacy and strict. With ANSI policy, Spark performs the type coercion as per ANSI SQL. In practice, the behavior is mostly the same as PostgreSQL.  It disallows certain unreasonable type conversions such as converting string to int or double to boolean.  With legacy po [...]
+|Property Name|Default| Meaning [...]
+|-------------|-------|--------- [...]
+|`spark.sql.ansi.enabled`|false| When true, Spark tries to conform to the ANSI SQL specification: <br/> 1. Spark SQL will throw runtime exceptions on invalid operations, including integer overflow errors, string parsing errors, etc. <br/> 2. Spark will use different type coercion rules for resolving conflicts among data types. The rules are consistently based on data type precedence. [...]
+|`spark.sql.storeAssignmentPolicy`|ANSI| When inserting a value into a column with different data type, Spark will perform type conversion.  Currently, we support 3 policies for the type coercion rules: ANSI, legacy and strict.<br/> 1. With ANSI policy, Spark performs the type coercion as per ANSI SQL. In practice, the behavior is mostly the same as PostgreSQL.  It disallows certain unreasonable type conversions such as converting string to int or double to boolean. On inserting a numeri [...]
 
 The following subsections present behaviour changes in arithmetic operations, type conversions, and SQL parsing when the ANSI mode enabled. For type conversions in Spark SQL, there are three kinds of them and this article will introduce them one by one: cast, store assignment and type coercion.
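
As a quick sketch of the runtime-error behavior the updated table describes (assuming a Spark 3.2+ spark-sql shell; the exact error text varies by version):

    -- Default (spark.sql.ansi.enabled=false): an invalid cast quietly returns NULL
    SET spark.sql.ansi.enabled=false;
    SELECT CAST('abc' AS INT);    -- NULL

    -- ANSI mode: the same cast throws a runtime exception instead of returning NULL
    SET spark.sql.ansi.enabled=true;
    SELECT CAST('abc' AS INT);    -- runtime error: invalid input for CAST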
 

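A similar sketch for the store assignment rules mentioned above: with the default `spark.sql.storeAssignmentPolicy=ANSI`, unreasonable conversions such as string-to-int are rejected when writing to a table, while the LEGACY policy permits them. The table name `t` below is hypothetical, and error messages vary by version:

    CREATE TABLE t (i INT) USING PARQUET;

    -- ANSI policy (default): storing a string into an INT column fails analysis
    INSERT INTO t VALUES ('1');   -- error: cannot safely cast string to int

    -- LEGACY policy: the same insert is allowed and the value is cast
    SET spark.sql.storeAssignmentPolicy=LEGACY;
    INSERT INTO t VALUES ('1');   -- succeeds; '1' is cast to 1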

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
