AngersZhuuuu commented on a change in pull request #30421:
URL: https://github.com/apache/spark/pull/30421#discussion_r585199108



##########
File path: sql/core/src/test/scala/org/apache/spark/sql/SQLInsertTestSuite.scala
##########
@@ -209,6 +211,72 @@ trait SQLInsertTestSuite extends QueryTest with SQLTestUtils {
     }
   }
 
+  test("SPARK-33474: Support typed literals as partition spec values") {
+    withTable("t1", "t3") {

Review comment:
       > did we create t3?
   
   The push failed yesterday; it's updated now.
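
A minimal spark-shell sketch of the behavior the new test exercises (the table and column names here are illustrative, not the PR's actual test fixtures; `spark` is the session predefined by spark-shell):

```scala
// Illustrative: a typed date literal used as a partition value in INSERT ... PARTITION.
spark.sql("CREATE TABLE t1 (name STRING, part DATE) USING parquet PARTITIONED BY (part)")
// With SPARK-33474, date'2019-01-02' is parsed as a DATE literal rather than kept as a raw string.
spark.sql("INSERT INTO t1 PARTITION (part = date'2019-01-02') VALUES ('a')")
spark.sql("SELECT name, part FROM t1").show()  // expect one row: a, 2019-01-02
spark.sql("DROP TABLE t1")
```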

##########
File path: docs/sql-ref-syntax-ddl-alter-table.md
##########
@@ -126,7 +126,7 @@ ALTER TABLE table_identifier ADD [IF NOT EXISTS]
 
 * **partition_spec**
 
-    Partition to be added.
+    Partition to be added. Note that one can use a typed literal (e.g., date'2019-01-02') for a partition column value.

Review comment:
       Done
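
For reference, a hedged spark-shell sketch of the documented ADD PARTITION syntax with a typed literal (the `events` table is hypothetical):

```scala
// Hypothetical partitioned table used only to demonstrate the syntax.
spark.sql("CREATE TABLE events (id INT, dt DATE) USING parquet PARTITIONED BY (dt)")
// The partition column value is a typed date literal, not a quoted string.
spark.sql("ALTER TABLE events ADD IF NOT EXISTS PARTITION (dt = date'2019-01-02')")
spark.sql("SHOW PARTITIONS events").show(truncate = false)  // expect dt=2019-01-02
```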

##########
File path: docs/sql-ref-syntax-ddl-alter-table.md
##########
@@ -49,7 +49,7 @@ ALTER TABLE table_identifier partition_spec RENAME TO partition_spec
 
 * **partition_spec**
 
-    Partition to be renamed.
+    Partition to be renamed. Note that one can use a typed literal (e.g., date'2019-01-02') for a partition column value.

Review comment:
      > Note that one can use typed literals (e.g. ...) in the partition spec.
   
   Done
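
Similarly for the RENAME form, a hedged sketch continuing the hypothetical `events` table from the ADD PARTITION example above:

```scala
// Typed date literals on both sides of the RENAME partition specs.
spark.sql(
  "ALTER TABLE events PARTITION (dt = date'2019-01-02') " +
  "RENAME TO PARTITION (dt = date'2019-01-03')")
spark.sql("SHOW PARTITIONS events").show(truncate = false)  // expect dt=2019-01-03
```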

##########
File path: docs/sql-migration-guide.md
##########
@@ -65,6 +65,8 @@ license: |
  
  - In Spark 3.2, the output schema of `SHOW TBLPROPERTIES` becomes `key: string, value: string` whether you specify the table property key or not. In Spark 3.1 and earlier, the output schema of `SHOW TBLPROPERTIES` is `value: string` when you specify the table property key. To restore the old schema with the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to `true`.
 
+  - In Spark 3.2, we support typed literals as partition constant values in a partition spec clause. For example, the right-side constant in `PARTITION (dt = date'2020-01-01')` is parsed as a date-typed literal in the partition spec. In Spark 3.1 and earlier, the partition value is treated as the string value `date '2020-01-01'`, which is an illegal string for the date type and is therefore converted to `__HIVE_DEFAULT_PARTITION__`.

Review comment:
       Done
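
To make the behavior change concrete, a hedged spark-shell sketch of what the migration note describes (table name illustrative):

```scala
// Hypothetical partitioned table for the migration-note example.
spark.sql("CREATE TABLE logs (msg STRING, dt DATE) USING parquet PARTITIONED BY (dt)")
// Spark 3.2: the right-hand side is parsed as a DATE literal, so this creates dt=2020-01-01.
// Spark 3.1 and earlier: the same text was treated as the string "date '2020-01-01'",
// which is not a valid date and therefore ended up as __HIVE_DEFAULT_PARTITION__.
spark.sql("ALTER TABLE logs ADD PARTITION (dt = date'2020-01-01')")
spark.sql("SHOW PARTITIONS logs").show(truncate = false)
```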



