JingsongLi commented on code in PR #6086:
URL: https://github.com/apache/paimon/pull/6086#discussion_r2281557533


##########
paimon-spark/paimon-spark-common/src/main/scala/org/apache/paimon/spark/commands/DeleteFromPaimonTableCommand.scala:
##########
@@ -147,10 +147,13 @@ case class DeleteFromPaimonTableCommand(
 
       // Step4: build a dataframe that contains the unchanged data, and write out them.
       val toRewriteScanRelation = Filter(Not(condition), newRelation)
-      val data = createDataset(sparkSession, toRewriteScanRelation)
+      var data = createDataset(sparkSession, toRewriteScanRelation)
+      if (coreOptions.rowTrackingEnabled()) {
+        data = selectWithRowLineageMetaCols(data)
+      }
 
       // only write new files, should have no compaction
-      val addCommitMessage = dvSafeWriter.writeOnly().write(data)
+      val addCommitMessage = dvSafeWriter.withRowLineage().write(data)

Review Comment:
   Should we keep writeOnly?
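
   A hedged sketch of what keeping writeOnly could look like, assuming writeOnly() still returns the writer and can be chained with the new withRowLineage() call introduced in this PR (the chaining is an assumption, not confirmed by the diff):

       // only write new files, should have no compaction; keep writeOnly()
       // and enable row lineage only when row tracking is on (hypothetical chaining)
       val writer =
         if (coreOptions.rowTrackingEnabled()) {
           dvSafeWriter.writeOnly().withRowLineage()
         } else {
           dvSafeWriter.writeOnly()
         }
       val addCommitMessage = writer.write(data)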


