cloud-fan commented on code in PR #53215:
URL: https://github.com/apache/spark/pull/53215#discussion_r2563083280


##########
sql/core/src/main/scala/org/apache/spark/sql/classic/DataFrameWriter.scala:
##########
@@ -484,12 +484,20 @@ final class DataFrameWriter[T] private[sql](ds: 
Dataset[T]) extends sql.DataFram
           serde = None,
           external = false,
           constraints = Seq.empty)
+        val writeOptions = lookupV2Provider() match {

Review Comment:
   Looking at the code around it, we only care about the case where we overwrite an existing table via the v1 `saveAsTable` path, which means `tableOpt` should be defined in this case.
   
   Then it makes more sense to have the marker interface extend `Table` instead of `TableProvider`. It's also more flexible: if Delta Lake wants to move away from the legacy behavior, it can do so table by table.
   
   My proposal
   ```
   interface SupportsV1OverwriteWithSaveAsTable extends Table {
     String WRITE_OPTION_NAME = "...";
     default boolean addMarkerWriteOption() { return true; }
   }
   ```
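   For concreteness, here is a hedged sketch of how the writer side might consume such a per-table marker. All names, the option-key value, and the `writeOptions` helper are illustrative stand-ins, not the actual Spark or Delta Lake API:
   ```java
   import java.util.Collections;
   import java.util.Map;

   // Minimal stand-in for the connector Table interface.
   interface Table {}

   // The proposed marker, extending Table so the opt-in is per table.
   interface SupportsV1OverwriteWithSaveAsTable extends Table {
       // Placeholder key; the real option name is still open in the PR.
       String WRITE_OPTION_NAME = "__v1_overwrite_with_save_as_table";

       default boolean addMarkerWriteOption() { return true; }
   }

   class LegacyTable implements SupportsV1OverwriteWithSaveAsTable {}
   class ModernTable implements Table {}

   public class MarkerOptionDemo {
       // Only tables that implement the marker (and keep the default
       // addMarkerWriteOption() == true) receive the extra write option.
       static Map<String, String> writeOptions(Table table) {
           if (table instanceof SupportsV1OverwriteWithSaveAsTable
                   && ((SupportsV1OverwriteWithSaveAsTable) table).addMarkerWriteOption()) {
               return Collections.singletonMap(
                   SupportsV1OverwriteWithSaveAsTable.WRITE_OPTION_NAME, "true");
           }
           return Collections.emptyMap();
       }

       public static void main(String[] args) {
           // Legacy table opts in; a plain Table gets no marker option.
           System.out.println(writeOptions(new LegacyTable()));
           System.out.println(writeOptions(new ModernTable()));
       }
   }
   ```
   Because the decision hangs off `Table` rather than `TableProvider`, a connector can override `addMarkerWriteOption()` to return `false` for individual tables as it migrates them off the legacy behavior.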



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

