amaliujia commented on code in PR #46778:
URL: https://github.com/apache/spark/pull/46778#discussion_r1618185941


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/SQLQuerySuite.scala:
##########
@@ -1108,35 +1110,31 @@ abstract class SQLQuerySuiteBase extends QueryTest with SQLTestUtils with TestHi
   }
 
   test("dynamic partition value test") {
-    try {
-      sql("set hive.exec.dynamic.partition.mode=nonstrict")
-      // date
-      sql("drop table if exists dynparttest1")
-      sql("create table dynparttest1 (value int) partitioned by (pdate date)")
-      sql(
-        """
-          |insert into table dynparttest1 partition(pdate)
-          | select count(*), cast('2015-05-21' as date) as pdate from src
-        """.stripMargin)
-      checkAnswer(
-        sql("select * from dynparttest1"),
-        Seq(Row(500, java.sql.Date.valueOf("2015-05-21"))))
+    withTable("dynparttest1", "dynparttest2") {
+      withSQLConf("hive.exec.dynamic.partition.mode" -> "strict") {
+        sql("set hive.exec.dynamic.partition.mode=nonstrict")

Review Comment:
   That is the purpose of using `withSQLConf`: each test case should set only the configs it needs, and the helper restores them to their previous values afterwards.
   
   A test case should neither pollute other test cases nor depend on the starting values of configs left behind by other test cases.
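   
   A minimal sketch of what this could look like, reusing the statements from the diff above and assuming the usual `SQLTestUtils` helpers (`withTable`, `withSQLConf`) available in this suite:
   
   ```scala
   // Sketch only: set the value the test actually needs via withSQLConf instead
   // of issuing a raw SET inside the block, so the previous value is restored
   // automatically when the block exits.
   withTable("dynparttest1", "dynparttest2") {
     withSQLConf("hive.exec.dynamic.partition.mode" -> "nonstrict") {
       // date partition column
       sql("create table dynparttest1 (value int) partitioned by (pdate date)")
       sql(
         """
           |insert into table dynparttest1 partition(pdate)
           | select count(*), cast('2015-05-21' as date) as pdate from src
         """.stripMargin)
       checkAnswer(
         sql("select * from dynparttest1"),
         Seq(Row(500, java.sql.Date.valueOf("2015-05-21"))))
     }
   }
   ```
   
   `withSQLConf` records the current value of each key, applies the new one, and restores the recorded value (or unsets the key) after the block, so no manual `set` cleanup is needed.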



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

