RussellSpitzer commented on pull request #31541:
URL: https://github.com/apache/spark/pull/31541#issuecomment-776954285
I mean, I feel like we have been allowing folks to override `spark_catalog` for a while, and it has been documented in several places, like here: https://github.com/apache/spark/blob/9b875ceada60732899053fbd90728b4944d1c03d/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L2702-L2712

That makes it sound to me like we are allowed to replace this, and the only requirement is that you return the same tables that the original session catalog would. I also know a few other implementers who have been using this parameter to set the session catalog, like the Delta Lake quick start (https://docs.delta.io/latest/quick-start.html):

```
pyspark --packages io.delta:delta-core_2.12:0.8.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```

Maybe we should just lock off this parameter and only allow setting the "default catalog", which is now available?
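For the default-catalog route, a minimal sketch of what that could look like: register the custom catalog under its own name and point `spark.sql.defaultCatalog` at it, leaving `spark_catalog` untouched. Here `my_catalog` and `com.example.MyCatalog` are placeholder names, not anything that actually exists:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical setup: the custom catalog lives under its own name
// ("my_catalog") instead of replacing the built-in spark_catalog, and
// spark.sql.defaultCatalog makes it the default for unqualified identifiers.
val spark = SparkSession.builder()
  .config("spark.sql.catalog.my_catalog", "com.example.MyCatalog")
  .config("spark.sql.defaultCatalog", "my_catalog")
  .getOrCreate()
```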
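And on the "return the same tables" requirement above: as I understand it, an override can satisfy that contract by extending `DelegatingCatalogExtension`, which forwards every call to the built-in session catalog unless overridden (this is what Delta's `DeltaCatalog` does). A rough sketch, with `CustomSessionCatalog` as a made-up name:

```scala
import org.apache.spark.sql.connector.catalog.{DelegatingCatalogExtension, Identifier, Table}

// Hypothetical spark_catalog override: DelegatingCatalogExtension delegates
// everything to the real session catalog, so the "same tables" contract holds;
// individual methods can be overridden to add custom behavior.
class CustomSessionCatalog extends DelegatingCatalogExtension {
  override def loadTable(ident: Identifier): Table = {
    // Load from the built-in session catalog, then optionally wrap or
    // inspect the result before returning it.
    super.loadTable(ident)
  }
}
```

It would then be wired in the same way as the Delta example, via `--conf "spark.sql.catalog.spark_catalog=com.example.CustomSessionCatalog"`.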
