nsivabalan commented on code in PR #8875:
URL: https://github.com/apache/hudi/pull/8875#discussion_r1228768022
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/hudi/functional/TestSqlStatement.scala:
##########
@@ -36,13 +36,15 @@ class TestSqlStatement extends HoodieSparkSqlTestBase {
val STATE_FINISH_ALL = 12
test("Test Sql Statements") {
- Seq("cow", "mor").foreach { tableType =>
- withTempDir { tmp =>
- val params = Map(
- "tableType" -> tableType,
- "tmpDir" -> tmp.getCanonicalPath
- )
- execSqlFile("/sql-statements.sql", params)
+ withSQLConf("hoodie.datasource.write.operation" -> "upsert") {
Review Comment:
Why do we have to fix this?
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestAlterTableDropPartition.scala:
##########
@@ -399,7 +399,9 @@ class TestAlterTableDropPartition extends HoodieSparkSqlTestBase {
     // drop 2021-10-01 partition
     spark.sql(s"alter table $tableName drop partition (year='2021', month='10', day='01')")
-    spark.sql(s"""insert into $tableName values (2, "l4", "v1", "2021", "10", "02")""")
+ withSQLConf("hoodie.datasource.write.operation" -> "upsert") {
Review Comment:
Can we add comments explaining why we had to do this? I am assuming you are
explicitly setting the op type to "upsert" only where it makes sense, not
just to make the test case pass. I just want to make sure we don't
unintentionally hide any bugs by doing this.
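For context on why scoping the override with `withSQLConf` is safer than setting the config globally: helpers of this kind typically save the previous value, apply the override for the duration of the block, and then restore it, so the "upsert" setting cannot leak into later statements and mask bugs elsewhere. Below is a minimal, self-contained sketch of that pattern; `ConfSketch` and `withConf` are illustrative stand-ins, not the actual Spark or Hudi API.

```scala
import scala.collection.mutable

// Illustrative sketch of a withSQLConf-style helper (NOT the real
// Spark/Hudi API): apply config overrides, run the body, then restore
// the previous values so the override stays scoped to the block.
object ConfSketch {
  val conf = mutable.Map[String, String]()

  def withConf(pairs: (String, String)*)(body: => Unit): Unit = {
    // Remember what each key was set to (or that it was unset).
    val saved = pairs.map { case (k, _) => k -> conf.get(k) }
    pairs.foreach { case (k, v) => conf(k) = v }
    try body
    finally saved.foreach {
      case (k, Some(v)) => conf(k) = v      // restore previous value
      case (k, None)    => conf.remove(k)   // key was unset before
    }
  }
}
```

With this shape, a test that pins `hoodie.datasource.write.operation` to "upsert" around a single insert leaves the surrounding statements running under the default operation type.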
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]