xianyinxin commented on a change in pull request #25115: [SPARK-28351][SQL]
Support DELETE in DataSource V2
URL: https://github.com/apache/spark/pull/25115#discussion_r313441345
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceResolution.scala
##########
@@ -173,6 +173,19 @@ case class DataSourceResolution(
// only top-level adds are supported using AlterTableAddColumnsCommand
AlterTableAddColumnsCommand(table, newColumns.map(convertToStructField))
+    case DeleteFromStatement(AsTableIdentifier(table), tableAlias, condition) =>
+      throw new AnalysisException(
Review comment:
If I understand correctly, one purpose of removing the first case is that we can
execute DELETE on the `parquet` format via this API (if we implement it later),
as @rdblue mentioned. The key point here is that we resolve the table using
`V2SessionCatalog` **as the fallback catalog**. The original `resolveTable`
doesn't provide any fallback-to-sessionCatalog mechanism (if no catalog is
found, it falls back to `resolveRelation`). So maybe we can modify
`resolveTable` to treat `V2SessionCatalog` as a fallback option:
```scala
case u @ UnresolvedRelation(CatalogObjectIdentifier(maybeCatalog, ident)) =>
  maybeCatalog.orElse(sessionCatalog) match {
    case Some(catalogPlugin) =>
      loadTable(catalogPlugin, ident).map(DataSourceV2Relation.create).getOrElse(u)
    case None =>
      u
  }
```
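To make the intended resolution order concrete, here is a minimal, self-contained sketch of the same fallback pattern outside Spark. The `Catalog` case class and `resolve` function are illustrative stand-ins, not Spark's API: the explicit catalog is preferred via `Option.orElse`, the session catalog is used as the fallback, and the relation is left unresolved (the input is returned unchanged) when the table cannot be loaded, mirroring the `getOrElse(u)` above:

```scala
// Illustrative stand-in for a catalog that can look up tables by name.
// (Hypothetical names; not Spark's CatalogPlugin API.)
case class Catalog(name: String, tables: Set[String])

// Prefer the explicit catalog, fall back to the session catalog, and
// return an "unresolved" marker when neither can resolve the table --
// analogous to returning the UnresolvedRelation `u` unchanged.
def resolve(maybeCatalog: Option[Catalog],
            sessionCatalog: Option[Catalog],
            table: String): String =
  maybeCatalog.orElse(sessionCatalog) match {
    case Some(cat) if cat.tables.contains(table) => s"${cat.name}.$table"
    case _ => s"unresolved:$table"
  }
```

With this shape, a `parquet` table that only the session catalog knows about still resolves, which is what enables DELETE on session-catalog tables once the source implements it.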
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]