aokolnychyi commented on a change in pull request #2116:
URL: https://github.com/apache/iceberg/pull/2116#discussion_r560560752



##########
File path: spark3-extensions/src/main/scala/org/apache/spark/sql/catalyst/optimizer/RewriteMergeInto.scala
##########
@@ -58,53 +61,138 @@ case class RewriteMergeInto(conf: SQLConf) extends Rule[LogicalPlan] with Rewrit
 
   override def apply(plan: LogicalPlan): LogicalPlan = {
     plan resolveOperators {
+      case MergeIntoTable(target: DataSourceV2Relation, source: LogicalPlan, cond, matchedActions, notMatchedActions)
+          if matchedActions.isEmpty =>
+
+        val mergeBuilder = target.table.asMergeable.newMergeBuilder("merge", newWriteInfo(target.schema))
+        val targetTableScan = buildSimpleScanPlan(target.table, target.output, mergeBuilder, cond)
+
+        // when there are no matched actions, use a left anti join to remove any matching rows and rewrite to use
+        // append instead of replace. only unmatched source rows are passed to the merge and actions are all inserts.
+        val joinPlan = Join(source, targetTableScan, LeftAnti, Some(cond), JoinHint.NONE)
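
For context, the branch above is semantically an anti join followed by an append. A rough DataFrame-level sketch of that behaviour (table and column names are hypothetical and only for illustration, this is not code from the PR):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

// hypothetical source and target tables joined on a hypothetical "id" column
val source = spark.table("source")
val target = spark.table("target")

// a left anti join keeps only the source rows with no matching target row,
// which is exactly what WHEN NOT MATCHED ... INSERT needs when there are no
// MATCHED clauses
val unmatched = source.join(target, source("id") === target("id"), "left_anti")

// the rewritten plan can then append these rows instead of replacing target files
unmatched.writeTo("target").append()
```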

Review comment:
       +1 on letting Spark estimate whether the table can be broadcast. I am going to resolve this thread to simplify the review. Please reopen if needed.
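
       As a side note, a minimal sketch of what "letting Spark do the estimation" means in practice (the config name is a standard Spark setting, the table and column names are hypothetical): the planner picks a broadcast hash join on its own whenever the estimated size of the right side of the anti join is below `spark.sql.autoBroadcastJoinThreshold`, so the rewrite does not need to force a broadcast hint.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

val spark = SparkSession.builder().getOrCreate()

// Spark broadcasts a join side automatically when its estimated size is below
// this threshold (10 MB is the default, shown here only for illustration)
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", 10L * 1024 * 1024)

// a user who still wants an explicit broadcast can hint the target side,
// since only the right side of a left anti join can be broadcast
val source = spark.table("source")   // hypothetical tables
val target = spark.table("target")
val joined = source.join(broadcast(target), Seq("id"), "left_anti")
```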





