xianyinxin commented on a change in pull request #25115: [SPARK-28351][SQL] Support DELETE in DataSource V2
URL: https://github.com/apache/spark/pull/25115#discussion_r309984823
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala
 ##########
 @@ -296,6 +310,17 @@ case class DataSourceResolution(
       orCreate = replace.orCreate)
   }
 
 +  private def convertDeleteFrom(
 +      catalog: TableCatalog,
 +      identifier: Identifier,
 +      delete: DeleteFromStatement): DeleteFromTable = {
 +    val relation = CatalogV2Util.loadTable(catalog, identifier)
 +        .map(DataSourceV2Relation.create).getOrElse(UnresolvedRelation(delete.tableName))
 +    val aliased = delete.tableAlias.map { SubqueryAlias(_, relation) }.getOrElse(relation)
 +    val filter = Filter(delete.condition, aliased)
 +    DeleteFromTable(aliased, filter)
 +  }
 
 Review comment:
   Thanks @cloud-fan. In fact, my first attempt was `DeleteFromTable(aliased, delete.condition)`. However, for a case like a delete with a subquery, resolution throws an exception, because neither `ResolveReferences` nor `ResolveSubqueries` handles a subquery nested inside a plain expression (the analyzer can only deal with a subquery inside a filter). To fix this, we have two options: 1) use a `Filter` rather than an `Expression` in `DeleteFromTable`, or 2) add additional resolution code to `ResolveReferences` or `ResolveSubqueries`. I adopted the first option.
   What do you think?
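The shape of option 1 can be sketched with simplified stand-in plan nodes. Note these case classes are hypothetical illustrations, not Spark's real Catalyst classes; the point is only that the delete condition rides inside a `Filter` child plan, where the analyzer's existing subquery resolution already applies:

```scala
// Hypothetical, simplified stand-ins for Catalyst logical plan nodes.
sealed trait LogicalPlan
case class UnresolvedRelation(name: String) extends LogicalPlan
case class SubqueryAlias(alias: String, child: LogicalPlan) extends LogicalPlan
case class Filter(condition: String, child: LogicalPlan) extends LogicalPlan

// Option 1 (as in the patch above): the condition is carried as a Filter
// child plan rather than a bare expression, so rules that resolve
// subqueries inside a Filter work without any new resolution code.
case class DeleteFromTable(table: LogicalPlan, condition: LogicalPlan) extends LogicalPlan

object Sketch extends App {
  val relation = UnresolvedRelation("t")
  val aliased  = SubqueryAlias("a", relation)
  // A delete condition containing a subquery, the problematic case discussed.
  val plan = DeleteFromTable(aliased, Filter("id IN (SELECT id FROM s)", aliased))
  println(plan.condition.isInstanceOf[Filter])
}
```

With option 2, `condition` would instead be a bare expression field, and the analyzer rules would need extra cases to descend into it and resolve any embedded subquery.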
