AngersZhuuuu commented on a change in pull request #30421:
URL: https://github.com/apache/spark/pull/30421#discussion_r585415321
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/SQLInsertTestSuite.scala
##########
@@ -209,6 +211,79 @@ trait SQLInsertTestSuite extends QueryTest with SQLTestUtils {
}
}
+ test("SPARK-33474: Support typed literals as partition spec values") {
+ withTable("t1") {
+ val binaryStr = "Spark SQL"
+ val binaryHexStr =
Hex.hex(UTF8String.fromString(binaryStr).getBytes).toString
+ sql(
+ """
+ | CREATE TABLE t1(name STRING, part1 DATE, part2 TIMESTAMP, part3
BINARY,
+ | part4 STRING, part5 STRING, part6 STRING, part7 STRING)
+ | USING PARQUET PARTITIONED BY (part1, part2, part3, part4, part5,
part6, part7)
+ """.stripMargin)
+
+ sql(
+ s"""
+ | INSERT OVERWRITE t1 PARTITION(
Review comment:
Done
##########
File path: docs/sql-migration-guide.md
##########
@@ -65,6 +65,8 @@ license: |
 - In Spark 3.2, the output schema of `SHOW TBLPROPERTIES` becomes `key: string, value: string` whether you specify the table property key or not. In Spark 3.1 and earlier, the output schema of `SHOW TBLPROPERTIES` is `value: string` when you specify the table property key. To restore the old schema with the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to `true`.
+ - In Spark 3.2, we support typed literals in the partition spec of INSERT and ADD/DROP/RENAME PARTITION. For example, `ADD PARTITION(dt = date'2020-01-01')` adds a partition with the date value `2020-01-01`. In Spark 3.1 and earlier, the partition value is parsed as the string value `date '2020-01-01'`, which is an illegal date value, so a partition with a null value is added instead.
Review comment:
Done
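
For context, here is a minimal SQL sketch of the behavior change the migration note describes. The table name `t` and column `dt` are illustrative only, not taken from the PR:

```sql
-- Illustrative sketch; the table and column names are hypothetical, not from the PR.
CREATE TABLE t (name STRING, dt DATE) USING PARQUET PARTITIONED BY (dt);

-- Spark 3.2: the typed literal is evaluated, adding the partition dt=2020-01-01.
ALTER TABLE t ADD PARTITION (dt = date'2020-01-01');

-- Spark 3.1 and earlier: the same value is parsed as the string
-- "date '2020-01-01'", which is not a valid date, so a partition with a
-- null value is added instead.
```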