AngersZhuuuu commented on a change in pull request #30421:
URL: https://github.com/apache/spark/pull/30421#discussion_r584844688



##########
File path: docs/sql-migration-guide.md
##########
@@ -65,6 +65,8 @@ license: |
  
   - In Spark 3.2, the output schema of `SHOW TBLPROPERTIES` becomes `key: 
string, value: string` whether you specify the table property key or not. In 
Spark 3.1 and earlier, the output schema of `SHOW TBLPROPERTIES` is `value: 
string` when you specify the table property key. To restore the old schema with 
the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to 
`true`.
 
+  - In Spark 3.2, we support typed literals as partition constant values in 
an INSERT clause. For example, the right-hand constant value in `PARTITION (dt = 
date'2020-01-01')` is parsed as a date-typed literal in the partition spec. In 
Spark 3.1 and earlier, the partition value is treated as the string value 
`date '2020-01-01'`, which is an illegal date string and is therefore 
converted to `__HIVE_DEFAULT_PARTITION__`.

Review comment:
       > let's update the migration guide as it's not only INSERT now.
   
   How about `we support a typed literal for a partition constant value in a 
partition spec clause`? It's more general.
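   To make the behavior change concrete, here is a minimal sketch; the table and column names (`t`, `dt`, `v`) are hypothetical and not taken from this PR:
   
   ```sql
   -- Assuming a partitioned table such as:
   --   CREATE TABLE t (v INT) PARTITIONED BY (dt DATE);
   
   -- Spark 3.2: the typed literal is parsed as a DATE value,
   -- so the row goes into the dt=2020-01-01 partition.
   INSERT INTO t PARTITION (dt = date'2020-01-01') VALUES (1);
   
   -- Spark 3.1 and earlier: the same spec is treated as the string
   -- `date '2020-01-01'`, which is not a valid DATE, so the row
   -- lands in the __HIVE_DEFAULT_PARTITION__ partition instead.
   ```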

##########
File path: docs/sql-ref-syntax-dml-insert-into.md
##########
@@ -41,7 +41,7 @@ INSERT INTO [ TABLE ] table_identifier [ partition_spec ] [ ( 
column_list ) ]
 * **partition_spec**
 
     An optional parameter that specifies a comma-separated list of key and 
value pairs
-    for partitions.
+    for partitions. Note that one can use a typed literal (e.g., 
date'2019-01-02') for a partition column value.

Review comment:
       Sure, done
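   
   As a usage sketch of the documented `partition_spec` syntax (the table `sales` and its columns are hypothetical, used only for illustration):
   
   ```sql
   -- A comma-separated partition spec mixing a plain string value
   -- and a typed date literal, per the doc note above.
   INSERT INTO sales PARTITION (region = 'US', dt = date'2019-01-02')
   SELECT id, amount FROM staging_sales;
   ```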




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


