maropu commented on a change in pull request #27590:
[SPARK-30703][SQL][DOCS][FollowUp] Declare the ANSI SQL compliance options as experimental
URL: https://github.com/apache/spark/pull/27590#discussion_r379816593
 
 

 ##########
 File path: docs/sql-ref-ansi-compliance.md
 ##########
 @@ -19,19 +19,21 @@ license: |
   limitations under the License.
 ---
 
-Spark SQL has two options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+Since Spark 3.0, Spark SQL introduces two experimental options to comply with the SQL standard: `spark.sql.ansi.enabled` and `spark.sql.storeAssignmentPolicy` (See a table below for details).
+
 When `spark.sql.ansi.enabled` is set to `true`, Spark SQL follows the standard in basic behaviours (e.g., arithmetic operations, type conversion, and SQL parsing).
 Moreover, Spark SQL has an independent option to control implicit casting behaviours when inserting rows in a table.
 The casting behaviours are defined as store assignment rules in the standard.
-When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules.
+
+When `spark.sql.storeAssignmentPolicy` is set to `ANSI`, Spark SQL complies with the ANSI store assignment rules. This is a separate configuration because its default value is `ANSI`, while the configuration `spark.sql.ansi.enabled` is disabled by default.
 
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
   <td><code>spark.sql.ansi.enabled</code></td>
   <td>false</td>
   <td>
-    When true, Spark tries to conform to the ANSI SQL specification:
+    (Experimental) When true, Spark tries to conform to the ANSI SQL specification:
 
 Review comment:
  What does `Experimental` mean here? Is it the same as the `@Experimental` annotation? In any case, this statement is just copied from the one in the SQL configuration document (https://github.com/apache/spark/pull/27459). So, instead of adding the `(Experimental)` prefix manually, I personally think it's better to add the prefix automatically via `sql/gen-sql-config-docs.py`. For example, how about adding an `experimental()` method to `ConfigBuilder`:
   ```
    val ANSI_ENABLED = buildConf("spark.sql.ansi.enabled")
      .experimental()
      .doc("When true, Spark tries to conform to the ANSI SQL specification: 1. Spark will " +
        "throw a runtime exception if an overflow occurs in any operation on integral/decimal " +
        "field. 2. Spark will forbid using the reserved keywords of ANSI SQL as identifiers in " +
        "the SQL parser.")
      .booleanConf
      .createWithDefault(false)
   ```
  Then, the script would automatically prepend `(Experimental)` to the head of the description.
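  The prefixing step described above could be sketched roughly as follows. This is a minimal, hypothetical Python sketch of the doc-generation side; `SQLConfEntry` and `render_doc` are illustrative names, not the actual structures used in `sql/gen-sql-config-docs.py`:

```python
from collections import namedtuple

# Hypothetical stand-in for a config entry read from SQLConf; the real script
# would need an "experimental" flag surfaced by ConfigBuilder.experimental().
SQLConfEntry = namedtuple("SQLConfEntry", ["name", "default", "doc", "experimental"])

def render_doc(entry):
    """Prepend "(Experimental)" to the description of experimental configs."""
    if entry.experimental:
        return "(Experimental) " + entry.doc
    return entry.doc

entry = SQLConfEntry(
    name="spark.sql.ansi.enabled",
    default="false",
    doc="When true, Spark tries to conform to the ANSI SQL specification: ...",
    experimental=True,
)
print(render_doc(entry))
```

  This way the `(Experimental)` prefix stays consistent across the SQL configuration page and `sql-ref-ansi-compliance.md`, with a single source of truth in the config definition.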

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
