dtenedor commented on a change in pull request #35690:
URL: https://github.com/apache/spark/pull/35690#discussion_r817924493



##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/DDLParserSuite.scala
##########
@@ -1424,6 +1424,30 @@ class DDLParserSuite extends AnalysisTest {
     assert(exc.getMessage.contains("Columns aliases are not allowed in UPDATE."))
   }
 
+  private def basicMergeIntoCommand(firstValue: String, secondValue: String): String =

Review comment:
       Done. (I had considered refactoring some of the MERGE INTO unit tests to share the logical plans, but that turned out to be more confusing than expected, so I skipped that approach.)

##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/DDLParserSuite.scala
##########
@@ -1424,6 +1424,30 @@ class DDLParserSuite extends AnalysisTest {
     assert(exc.getMessage.contains("Columns aliases are not allowed in UPDATE."))
   }
 
+  private def basicMergeIntoCommand(firstValue: String, secondValue: String): String =
+    """
+      |MERGE INTO testcat1.ns1.ns2.tbl AS target
+      |USING testcat2.ns1.ns2.tbl AS source
+      |ON target.col1 = source.col1
+      |WHEN MATCHED AND (target.col2='delete') THEN DELETE
+      |WHEN MATCHED AND (target.col2='update') THEN UPDATE SET target.col2 = source.col2
+      |WHEN NOT MATCHED AND (target.col2='insert')
+      |THEN INSERT (target.col1, target.col2) values (
+      """.stripMargin + firstValue + ", " + secondValue + ")"
+
+  private def basicMergeIntoPlan(firstValue: String, secondValue: String): LogicalPlan =

Review comment:
       Done.
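
       For context, a minimal usage sketch (not part of this diff): it assumes these helpers live in DDLParserSuite, that the suite's parseCompare helper is available, and that basicMergeIntoPlan builds the expected LogicalPlan for the SQL text above; the test name and the two argument values are hypothetical.

           // Hypothetical test: reuse the shared MERGE INTO text and expected plan,
           // substituting the two INSERT values being exercised.
           test("basic merge into command with column reference insert values") {
             parseCompare(
               basicMergeIntoCommand("source.col1", "source.col2"),
               basicMergeIntoPlan("source.col1", "source.col2"))
           }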




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


