cloud-fan commented on issue #25330: [SPARK-28565][SQL] DataFrameWriter saveAsTable support for V2 catalogs
URL: https://github.com/apache/spark/pull/25330#issuecomment-517564220
 
 
   I'd like to start a discussion about the v2 session catalog. I think the v2 session catalog is an extension of the builtin catalog (the Hive catalog). It allows users to add hooks that run when Spark sends metadata requests to the builtin catalog.
   
   With this in mind, the table lookup logic should be:
   1. if the table identifier specifies a catalog name, look up the table in that catalog, and fail if the table is not found
   2. if the table identifier does not have a catalog name, look up the table in the default catalog
       - if the default catalog has been set (i.e. it's a custom catalog), look up the table in it and fail if the table is not found
       - if the default catalog has not been set (i.e. it's the builtin catalog), look up the table via the v2 session catalog so that the hooks are applied
   
   Similar logic should be applied to table creation as well.
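   To make the resolution rules above concrete, here is a minimal sketch (in Python pseudocode, not Spark's actual API; the `Catalog`/`lookup_table` names and the `"session"` catalog key are all illustrative assumptions):

   ```python
   class TableNotFound(Exception):
       pass

   class Catalog:
       """Toy catalog: a mapping of table name -> metadata.

       Stands in for a CatalogPlugin; real catalogs would forward to
       Hive or an external metastore."""
       def __init__(self, tables):
           self.tables = tables

       def load(self, name):
           if name not in self.tables:
               raise TableNotFound(name)
           return self.tables[name]

   def lookup_table(identifier, catalogs, default_catalog=None):
       """Resolve ('catalog', 'table') or ('table',) per the rules above.

       `catalogs` maps catalog names to Catalog instances; the key
       "session" stands in for the v2 session catalog that wraps the
       builtin (Hive) catalog and applies user hooks."""
       if len(identifier) == 2:
           # 1. Catalog name given explicitly: use it, fail if not found.
           catalog_name, table = identifier
           return catalogs[catalog_name].load(table)
       (table,) = identifier
       if default_catalog is not None:
           # 2a. A custom default catalog is set: use it, fail if not found.
           return catalogs[default_catalog].load(table)
       # 2b. No default catalog set: go through the v2 session catalog
       # so the hooks run before delegating to the builtin catalog.
       return catalogs["session"].load(table)
   ```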
   
   I think this PR makes sense: if we need to go to the v2 session catalog for table creation, we should do the same for table lookup as well. But I do agree
with @jzhuge about 
https://github.com/apache/spark/pull/25330#discussion_r309950145
   
   What do you think? @rdblue @mccheah
