cloud-fan commented on pull request #31541:
URL: https://github.com/apache/spark/pull/31541#issuecomment-780319947
Technically people can replace the session catalog with a custom v2
implementation by setting `spark.sql.catalog.spark_catalog`, but this is not
the expected usage:
```scala
val V2_SESSION_CATALOG_IMPLEMENTATION =
  buildConf(s"spark.sql.catalog.$SESSION_CATALOG_NAME")
    .doc("A catalog implementation that will be used as the v2 interface to Spark's built-in " +
      s"v1 catalog: $SESSION_CATALOG_NAME. This catalog shares its identifier namespace with " +
      s"the $SESSION_CATALOG_NAME and must be consistent with it; for example, if a table can " +
      s"be loaded by the $SESSION_CATALOG_NAME, this catalog must also return the table " +
      s"metadata. To delegate operations to the $SESSION_CATALOG_NAME, implementations can " +
      "extend 'CatalogExtension'.")
    .version("3.0.0")
    .stringConf
    .createOptional
```
Users are only expected to extend the session catalog, not replace it.
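
As a sketch of the intended extension route: Spark ships `DelegatingCatalogExtension` (in `org.apache.spark.sql.connector.catalog`), which implements `CatalogExtension` and forwards every call to the built-in session catalog by default, so an implementation only overrides what it wants to customize. The class name and the `println` behavior below are made up for illustration:

```scala
import org.apache.spark.sql.connector.catalog.{DelegatingCatalogExtension, Identifier, Table}

// Hypothetical example: intercept one operation, delegate everything else.
// DelegatingCatalogExtension forwards all other calls to the v1 session catalog.
class AuditingSessionCatalog extends DelegatingCatalogExtension {
  override def loadTable(ident: Identifier): Table = {
    // custom behavior before falling back to the built-in session catalog
    println(s"loading table: $ident")
    super.loadTable(ident)
  }
}
```

Such a class would be registered via the config quoted above, e.g. `--conf spark.sql.catalog.spark_catalog=com.example.AuditingSessionCatalog` (package name is an assumption), and because it delegates, it stays consistent with the v1 catalog as the doc string requires.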
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.