rymurr commented on pull request #1843:
URL: https://github.com/apache/iceberg/pull/1843#issuecomment-734828999


   This is still a bit rough: one test fails and I haven't tested the `ALTER` 
commands yet, but I wanted to get it out for review early. 
   
   One key issue:
   It appears that, in current Spark, dropping tables through the Session Catalog is 
handled only as a `V1` DataSource operation (see 
[here](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala#L366)).
 So when using a Session Catalog, trying to drop a table by file path, e.g. ```DROP TABLE 
`file://path/to/iceberg/table` ```, fails with an error, because the V1 Session Catalog 
looks the table up in Hive and doesn't find it. 
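   
   For concreteness, here is a minimal sketch of the failing case (assuming a spark-shell 
session with our `SparkSessionCatalog` configured as the session catalog; the table path 
is the placeholder from above):
   
   ```scala
   // Sketch: session catalog replaced by Iceberg's SparkSessionCatalog, delegating to Hive.
   import org.apache.spark.sql.SparkSession
   
   val spark = SparkSession.builder()
     .appName("drop-by-path-repro")
     .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
     .config("spark.sql.catalog.spark_catalog.type", "hive")
     .enableHiveSupport()
     .getOrCreate()
   
   // ResolveSessionCatalog routes DROP TABLE to the V1 session catalog, which looks the
   // identifier up in Hive, doesn't find it, and errors instead of delegating to V2.
   spark.sql("DROP TABLE `file://path/to/iceberg/table`")
   ```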
   
   The `DROP` commands do work when using a `HiveCatalog` together with the Session 
Catalog: the table is created via a `HiveCatalog` (through our `SparkSessionCatalog`) 
and dropped via the V1 Session Catalog, but because both are talking to the same Hive 
metastore we don't notice the mismatch.
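   
   Roughly, the Hive-backed path looks like this (continuing the session sketched 
above, with a made-up table name):
   
   ```scala
   // Created through SparkSessionCatalog, which delegates to Iceberg's HiveCatalog,
   // so the table is registered in the Hive metastore.
   spark.sql("CREATE TABLE db.tbl (id BIGINT, data STRING) USING iceberg")
   
   // The DROP resolves through the V1 session catalog, but that catalog talks to the
   // same Hive metastore, so the lookup succeeds and the V1/V2 mismatch goes unnoticed.
   spark.sql("DROP TABLE db.tbl")
   ```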
   
   In my mind this means the Spark3 tests that run against the Session Catalog are 
(partially) broken, since the `DROP` command won't work for anything other than a 
Hive-backed table.
   
   So:
   * should we document this limitation and ignore it?
   * should we fix it on the Spark side?
   * should we fix it on our side (though I honestly don't know how)?
   
   @rdblue @aokolnychyi any thoughts?

