rdblue commented on issue #24768: [SPARK-27919][SQL] Add v2 session catalog
URL: https://github.com/apache/spark/pull/24768#issuecomment-510561238
 
 
   @brkyvz, those two properties have different purposes. If you set the 
default catalog, it will be used for all identifiers with no specified catalog. 
That's not what we want for Spark 3.0, where we want the session catalog -- not 
the v2 session catalog -- to continue to be the default.
   
   The v2 session catalog property, `spark.sql.catalog.session`, sets the v2 session catalog implementation, in case you want to replace the wrapper around the built-in session catalog to get different behavior. For example, you might want to extend the v2 session catalog and add special support for another table format, as in the sketch below.
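   
   To make that concrete, here is a minimal sketch of what such an extension could look like. It is written against the catalog API as it later shipped in Spark 3.0 (`DelegatingCatalogExtension` in `org.apache.spark.sql.connector.catalog`); the exact class and package names at the time of this PR may differ, and `MyFormatSessionCatalog`, the `"myformat"` provider value, and `createMyFormatTable` are all hypothetical.
   
   ```scala
   import java.util
   
   import org.apache.spark.sql.connector.catalog.{DelegatingCatalogExtension, Identifier, Table}
   import org.apache.spark.sql.connector.expressions.Transform
   import org.apache.spark.sql.types.StructType
   
   // Hypothetical catalog that replaces the v2 session catalog wrapper and adds
   // special handling for one table format, delegating everything else to the
   // built-in session catalog behavior.
   class MyFormatSessionCatalog extends DelegatingCatalogExtension {
   
     override def createTable(
         ident: Identifier,
         schema: StructType,
         partitions: Array[Transform],
         properties: util.Map[String, String]): Table = {
       if ("myformat".equalsIgnoreCase(properties.get("provider"))) {
         // Intercept tables created with the custom provider.
         createMyFormatTable(ident, schema, partitions, properties)
       } else {
         // Fall back to the delegate (the built-in session catalog) for everything else.
         super.createTable(ident, schema, partitions, properties)
       }
     }
   
     // Placeholder for format-specific table creation; not part of any Spark API.
     private def createMyFormatTable(
         ident: Identifier,
         schema: StructType,
         partitions: Array[Transform],
         properties: util.Map[String, String]): Table = {
       throw new UnsupportedOperationException("format-specific table creation goes here")
     }
   }
   ```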
   
   In short, the default catalog can't be used to configure the v2 session catalog without changing behavior and routing all SQL through the v2 code paths, so we still need a separate configuration to control the v2 session catalog implementation.
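   
   For reference, a sketch of how the two knobs would be used differently when building a `SparkSession`. The `spark.sql.catalog.session` key is the one discussed above; the extra named catalog (`cat`), the class names, and the default-catalog property name shown in the comment are illustrative assumptions, not part of this PR.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   val spark = SparkSession.builder()
     .appName("catalog-config-example")
     // Replaces the v2 session catalog wrapper. Unqualified identifiers still
     // resolve through the session catalog; only its v2 implementation changes.
     .config("spark.sql.catalog.session", "com.example.MyFormatSessionCatalog")
     // Registers an additional named catalog. This does NOT change the default;
     // it is only used for identifiers that name it explicitly, e.g. cat.db.tbl.
     .config("spark.sql.catalog.cat", "com.example.SomeOtherCatalog")
     // Setting a default catalog (e.g. "spark.sql.defaultCatalog" in later
     // builds; the property name here may differ) would instead redirect all
     // unqualified identifiers, which is the behavior change argued against above.
     .getOrCreate()
   ```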
