rdblue commented on a change in pull request #25115: [SPARK-28351][SQL] Support DELETE in DataSource V2
URL: https://github.com/apache/spark/pull/25115#discussion_r313064244
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala
##########
@@ -173,6 +173,19 @@ case class DataSourceResolution(
// only top-level adds are supported using AlterTableAddColumnsCommand
AlterTableAddColumnsCommand(table, newColumns.map(convertToStructField))
+    case DeleteFromStatement(AsTableIdentifier(table), tableAlias, condition) =>
+      throw new AnalysisException(
Review comment:
Since this always throws `AnalysisException`, I think this case should be
removed. Instead, the next case should match and the `V2SessionCatalog` should
be used. If the table loaded by the v2 session catalog doesn't support delete,
then conversion to physical plan will fail when `asDeletable` is called.
That way, users can still run v2 deletes against formats like `parquet` that
provide a working v2 implementation.
FYI @brkyvz.
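
To make the deferred-failure behavior concrete, here is a minimal, self-contained Scala sketch of the idea: resolution never throws, and only physical planning rejects a table that can't delete. The trait and method names (`Table`, `SupportsDelete`, `asDeletable`, `DeleteFromTable`) are illustrative stand-ins for the v2 concepts discussed above, not the actual Spark classes.

```scala
// Illustrative model only; not Spark's real API.

// A v2 table; some implementations also support deletes.
trait Table { def name: String }
trait SupportsDelete extends Table {
  def deleteWhere(condition: String): Unit
}

// Analysis/resolution stage: always produce a v2 delete plan, never throw here.
final case class DeleteFromTable(table: Table, condition: String)

object Planner {
  // Physical planning stage: only here do we require delete support,
  // mirroring the point where `asDeletable` is called.
  def asDeletable(table: Table): SupportsDelete = table match {
    case d: SupportsDelete => d
    case other =>
      throw new IllegalStateException(
        s"Table ${other.name} does not support deletes")
  }

  def execute(plan: DeleteFromTable): Unit =
    asDeletable(plan.table).deleteWhere(plan.condition)
}

object Demo extends App {
  // A table whose v2 implementation can delete succeeds...
  val deletable = new SupportsDelete {
    val name = "t1"
    def deleteWhere(condition: String): Unit =
      println(s"deleting from $name where $condition")
  }
  Planner.execute(DeleteFromTable(deletable, "id = 1"))

  // ...while a table without delete support fails only at physical
  // planning, not during analysis.
  val readOnly = new Table { val name = "t2" }
  try Planner.execute(DeleteFromTable(readOnly, "id = 1"))
  catch { case e: IllegalStateException => println(e.getMessage) }
}
```

In this model the early `throw` in the resolution rule corresponds to rejecting `readOnly` before `Planner.execute` is ever reached, which is exactly what the comment argues against: dropping that case keeps the analysis path uniform and lets capable sources go through.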